Apr 24 23:53:19.747483 ip-10-0-139-62 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 23:53:20.219254 ip-10-0-139-62 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:20.219254 ip-10-0-139-62 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 23:53:20.219254 ip-10-0-139-62 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:20.219254 ip-10-0-139-62 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 23:53:20.219254 ip-10-0-139-62 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
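Each of the deprecation warnings above points at the file passed via --config. A minimal KubeletConfiguration sketch covering the flags named in these warnings; the path matches the --config value logged further down, but the field values here are illustrative assumptions, not taken from this node:

```yaml
# /etc/kubernetes/kubelet.conf — KubeletConfiguration equivalents of the
# deprecated flags warned about above (values are placeholders).
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec   # replaces --volume-plugin-dir
systemReserved:              # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:                # per the --minimum-container-ttl-duration warning,
  memory.available: 100Mi    # use eviction thresholds instead
```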
Apr 24 23:53:20.221139 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.221048    2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 23:53:20.226231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226191    2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:20.226231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226228    2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:20.226231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226233    2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:20.226231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226237    2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226241    2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226244    2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226247    2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226251    2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226255    2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226259    2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226262    2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226265    2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226268    2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226270    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226274    2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226276    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226279    2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226281    2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226284    2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226287    2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226289    2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226292    2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:20.226407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226294    2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226297    2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226305    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226307    2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226310    2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226313    2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226315    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226318    2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226321    2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226323    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226326    2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226329    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226331    2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226335    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226337    2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226340    2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226343    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226345    2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226350    2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:20.226844 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226354    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226357    2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226360    2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226362    2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226365    2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226367    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226370    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226372    2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226375    2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226377    2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226380    2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226382    2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226385    2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226388    2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226391    2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226393    2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226395    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226398    2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226401    2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226403    2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:20.227315 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226407    2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226410    2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226413    2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226415    2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226418    2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226421    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226423    2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226426    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226429    2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226432    2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226434    2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226438    2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226440    2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226442    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226445    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226448    2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226450    2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226454    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226457    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226459    2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:20.227799 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226462    2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226465    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226467    2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226469    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226472    2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226881    2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226886    2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226889    2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226892    2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226894    2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226897    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226899    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226902    2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226905    2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226907    2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226910    2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226913    2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226916    2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226918    2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226921    2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:20.228277 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226924    2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226926    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226930    2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
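The "unrecognized feature gate" warnings above appear twice because the kubelet evaluates its feature-gate set more than once during startup, logging the full list on each pass. A quick way to collapse the noise into a counted list of unique gate names; journalctl and the coreutils calls are standard, but the unit name kubelet.service is an assumption for this host:

```shell
# Summarize "unrecognized feature gate" warnings: one line per gate, with count.
journalctl -u kubelet.service --no-pager \
  | grep -o 'unrecognized feature gate: [A-Za-z0-9]*' \
  | awk '{ print $NF }' \
  | sort | uniq -c | sort -rn
```

The same grep/awk pipeline works on any saved copy of the log fed via stdin.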
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226933    2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226936    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226938    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226941    2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226943    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226946    2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226949    2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226952    2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226955    2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226958    2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226960    2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226963    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226967    2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226970    2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226973    2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226976    2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:20.228740 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226979    2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226981    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226984    2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226987    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226990    2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226992    2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226995    2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.226997    2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227000    2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227002    2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227004    2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227007    2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227009    2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227012    2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227015    2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227017    2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227019    2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227022    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227025    2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227027    2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:20.229214 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227030    2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227033    2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227035    2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227038    2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227041    2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227043    2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227045    2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227048    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227050    2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227053    2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227055    2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227058    2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227060    2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227063    2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227065    2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227068    2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227070    2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227073    2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227075    2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:20.229691 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227078    2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227080    2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227083    2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227085    2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227088    2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227090    2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227092    2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227094    2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227097    2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227099    2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227101    2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227105    2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.227107    2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227646    2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227655    2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227661    2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227665    2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227670    2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227674    2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227678    2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227683    2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 23:53:20.230134 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227686    2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227689    2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227692    2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227696    2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227699    2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227702    2566 flags.go:64] FLAG: --cgroup-root=""
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227704    2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227707    2566 flags.go:64] FLAG: --client-ca-file=""
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227711    2566 flags.go:64] FLAG: --cloud-config=""
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227713    2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227716    2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227721    2566 flags.go:64] FLAG: --cluster-domain=""
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227723    2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227727    2566 flags.go:64] FLAG: --config-dir=""
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227730    2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227733    2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227737    2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227739    2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227742    2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227745    2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227748    2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227751    2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227754    2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227757    2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227761    2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 23:53:20.230653 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227766    2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227769    2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227772    2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227775    2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227778    2566 flags.go:64] FLAG: --enable-server="true"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227781    2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227785    2566 flags.go:64] FLAG: --event-burst="100"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227788    2566 flags.go:64] FLAG: --event-qps="50"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227791    2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227795    2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227797    2566 flags.go:64] FLAG: --eviction-hard=""
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227801    2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227804    2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227807    2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227810    2566 flags.go:64] FLAG: --eviction-soft=""
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227813    2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227816    2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227819    2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227822    2566 flags.go:64] FLAG:
--experimental-mounter-path="" Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227825 2566 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227828 2566 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227830 2566 flags.go:64] FLAG: --feature-gates="" Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227834 2566 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227837 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227840 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 23:53:20.231224 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227843 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227846 2566 flags.go:64] FLAG: --healthz-port="10248" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227849 2566 flags.go:64] FLAG: --help="false" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227852 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227855 2566 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227858 2566 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227863 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227866 2566 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227870 2566 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227872 2566 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227875 2566 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227878 2566 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227881 2566 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227884 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227887 2566 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227890 2566 flags.go:64] FLAG: --kube-reserved="" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227893 2566 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227896 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227898 2566 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227901 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227904 2566 flags.go:64] FLAG: --lock-file="" Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227906 2566 flags.go:64] FLAG: --log-cadvisor-usage="false" 
Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227910 2566 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227912 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 24 23:53:20.231803 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227918 2566 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227920 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227923 2566 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227926 2566 flags.go:64] FLAG: --logging-format="text"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227928 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227932 2566 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227934 2566 flags.go:64] FLAG: --manifest-url=""
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227937 2566 flags.go:64] FLAG: --manifest-url-header=""
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227941 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227944 2566 flags.go:64] FLAG: --max-open-files="1000000"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227948 2566 flags.go:64] FLAG: --max-pods="110"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227951 2566 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227954 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227957 2566 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227961 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227964 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227967 2566 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227969 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227977 2566 flags.go:64] FLAG: --node-status-max-images="50"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227980 2566 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227983 2566 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227987 2566 flags.go:64] FLAG: --pod-cidr=""
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227989 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 24 23:53:20.232408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227994 2566 flags.go:64] FLAG: --pod-manifest-path=""
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.227997 2566 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228000 2566 flags.go:64] FLAG: --pods-per-core="0"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228003 2566 flags.go:64] FLAG: --port="10250"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228006 2566 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228009 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d7ac1863cc87eae6"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228013 2566 flags.go:64] FLAG: --qos-reserved=""
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228016 2566 flags.go:64] FLAG: --read-only-port="10255"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228019 2566 flags.go:64] FLAG: --register-node="true"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228021 2566 flags.go:64] FLAG: --register-schedulable="true"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228024 2566 flags.go:64] FLAG: --register-with-taints=""
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228028 2566 flags.go:64] FLAG: --registry-burst="10"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228031 2566 flags.go:64] FLAG: --registry-qps="5"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228033 2566 flags.go:64] FLAG: --reserved-cpus=""
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228036 2566 flags.go:64] FLAG: --reserved-memory=""
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228040 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228043 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228045 2566 flags.go:64] FLAG: --rotate-certificates="false"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228048 2566 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228051 2566 flags.go:64] FLAG: --runonce="false"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228054 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228057 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228060 2566 flags.go:64] FLAG: --seccomp-default="false"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228064 2566 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228067 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228070 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 24 23:53:20.232946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228073 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228076 2566 flags.go:64] FLAG: --storage-driver-password="root"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228078 2566 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228081 2566 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228085 2566 flags.go:64] FLAG: --storage-driver-user="root"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228097 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228101 2566 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228104 2566 flags.go:64] FLAG: --system-cgroups=""
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228107 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228112 2566 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228115 2566 flags.go:64] FLAG: --tls-cert-file=""
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228117 2566 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228122 2566 flags.go:64] FLAG: --tls-min-version=""
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228124 2566 flags.go:64] FLAG: --tls-private-key-file=""
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228127 2566 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228130 2566 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228133 2566 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228135 2566 flags.go:64] FLAG: --v="2"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228144 2566 flags.go:64] FLAG: --version="false"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228147 2566 flags.go:64] FLAG: --vmodule=""
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228152 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.228155 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228273 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228277 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228281 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:20.233556 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228284 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228286 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228290 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228293 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228296 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228299 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228302 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228304 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228307 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228309 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228312 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228316 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228318 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228321 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228324 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228327 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228329 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228332 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228335 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228337 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:20.234121 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228340 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228354 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228357 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228360 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228363 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228366 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228369 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228373 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228377 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228381 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228384 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228387 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228389 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228392 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228395 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228397 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228400 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228403 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228405 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:20.234621 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228407 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228410 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228412 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228415 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228419 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228421 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228424 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228427 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228429 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228432 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228435 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228437 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228439 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228442 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228445 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228447 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228450 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228453 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228455 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228457 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:20.235075 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228460 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228462 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228465 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228467 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228470 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228472 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228475 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228477 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228480 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228482 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228484 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228487 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228489 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228491 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228494 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228496 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228500 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228503 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228505 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:20.235599 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228508 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:20.236052 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228511 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:20.236052 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228513 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:20.236052 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228516 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:20.236052 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.228519 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:20.236052 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.229414 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:20.237753 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.237733 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 23:53:20.237788 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.237753 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 23:53:20.237821 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237804 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:20.237821 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237809 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:20.237821 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237812 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:20.237821 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237815 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:20.237821 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237818 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:20.237821 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237821 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:20.237821 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237823 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237827 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237829 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237832 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237835 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237838 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237841 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237844 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237846 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237850 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237852 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237855 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237857 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237860 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237863 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237865 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237868 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237870 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237873 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237875 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:20.237994 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237878 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237881 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237884 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24
23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237887 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237889 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237899 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237902 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237905 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237907 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237909 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237912 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237916 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237920 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237923 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237926 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237928 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237931 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237933 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237936 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:20.238482 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237938 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237940 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237943 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237945 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237948 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237950 2566 feature_gate.go:328] 
unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237953 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237955 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237957 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237960 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237963 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237965 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237968 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237970 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237973 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237975 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237978 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237980 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237983 2566 
feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237988 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:20.238920 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237991 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237993 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.237997 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238001 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238004 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238007 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238009 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238012 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238014 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238017 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238020 2566 feature_gate.go:328] unrecognized feature 
gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238022 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238025 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238028 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238030 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238033 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238035 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238037 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238040 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:20.239407 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238043 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238046 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.238051 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true 
SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238159 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238164 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238167 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238170 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238173 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238175 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238178 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238181 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238183 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238186 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238188 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:20.239851 ip-10-0-139-62 
kubenswrapper[2566]: W0424 23:53:20.238191 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238193 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:20.239851 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238196 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238198 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238216 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238221 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238224 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238226 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238229 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238232 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238235 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238237 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238240 2566 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238242 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238245 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238247 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238249 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238252 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238254 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238257 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238259 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238262 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:20.240231 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238264 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238267 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238269 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 
23:53:20.238272 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238274 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238276 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238279 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238282 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238284 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238286 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238290 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238292 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238295 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238297 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238300 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238303 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 
24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238305 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238307 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238310 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238312 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 23:53:20.240703 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238315 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238317 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238320 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238322 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238325 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238327 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238329 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238332 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238334 2566 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238337 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238339 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238342 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238344 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238347 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238349 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238352 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238354 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238356 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238359 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 23:53:20.241194 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238362 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238364 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 
23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238367 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238369 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238372 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238374 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238377 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238379 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238382 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238384 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238386 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238389 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238391 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:20.238394 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.238398 2566 feature_gate.go:384] feature gates: 
{map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 23:53:20.241647 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.239339 2566 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 23:53:20.242358 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.242344 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 23:53:20.243342 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.243331 2566 server.go:1019] "Starting client certificate rotation" Apr 24 23:53:20.243443 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.243426 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 23:53:20.243476 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.243466 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 23:53:20.266338 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.266319 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 23:53:20.269874 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.269842 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 23:53:20.286790 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.286767 2566 log.go:25] "Validated CRI v1 runtime API" Apr 24 23:53:20.292826 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.292795 2566 log.go:25] 
"Validated CRI v1 image API" Apr 24 23:53:20.293267 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.293250 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 23:53:20.294159 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.294144 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 23:53:20.301246 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.301227 2566 fs.go:135] Filesystem UUIDs: map[3e21b025-da3c-44b7-b86d-8eb3fa9199a3:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 aefb124a-aa18-4188-bd04-7285c5d3018c:/dev/nvme0n1p3] Apr 24 23:53:20.301308 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.301246 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 23:53:20.306889 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.306783 2566 manager.go:217] Machine: {Timestamp:2026-04-24 23:53:20.30487593 +0000 UTC m=+0.434084199 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3199884 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c969f9d0e4913f2d557bb6e66231e SystemUUID:ec2c969f-9d0e-4913-f2d5-57bb6e66231e BootID:6331826f-1246-4282-aaf1-12fef16aca32 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs 
Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b2:6a:4f:47:27 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b2:6a:4f:47:27 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ea:9d:d4:b6:0e:81 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 23:53:20.306889 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.306884 2566 manager_no_libpfm.go:29] 
cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 24 23:53:20.306989 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.306965 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 23:53:20.308036 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.308012 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 23:53:20.308191 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.308038 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-62.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReser
ved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 23:53:20.308243 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.308200 2566 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 23:53:20.308243 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.308220 2566 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 23:53:20.308243 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.308232 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:20.308824 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.308812 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:20.309542 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.309532 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:20.309645 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.309637 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 23:53:20.312083 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.312073 2566 kubelet.go:491] "Attempting to sync node with API server" Apr 24 23:53:20.312137 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.312093 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 23:53:20.312137 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.312110 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 23:53:20.312137 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.312119 2566 kubelet.go:397] "Adding apiserver pod source" Apr 24 23:53:20.312137 ip-10-0-139-62 
kubenswrapper[2566]: I0424 23:53:20.312128 2566 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 23:53:20.313314 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.313302 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:20.313361 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.313320 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:20.316272 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.316257 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 23:53:20.317665 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.317652 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 23:53:20.319854 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.319837 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 23:53:20.319854 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.319855 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 23:53:20.319976 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.319862 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 23:53:20.319976 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.319871 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 23:53:20.319976 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.319879 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 23:53:20.319976 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.319885 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 23:53:20.319976 ip-10-0-139-62 kubenswrapper[2566]: I0424 
23:53:20.319891 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 23:53:20.319976 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.319896 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 23:53:20.319976 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.319903 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 23:53:20.319976 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.319909 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 23:53:20.319976 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.319929 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 23:53:20.319976 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.319940 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 23:53:20.320912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.320900 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 23:53:20.320912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.320912 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 23:53:20.325378 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.325360 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 23:53:20.325493 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.325414 2566 server.go:1295] "Started kubelet" Apr 24 23:53:20.325810 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.325532 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 23:53:20.325885 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.325834 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-62.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the 
cluster scope Apr 24 23:53:20.325936 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.325879 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 23:53:20.325936 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.325891 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-62.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 23:53:20.326031 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.325947 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 23:53:20.326081 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.326051 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 23:53:20.327693 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.327678 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 23:53:20.327855 ip-10-0-139-62 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 23:53:20.328738 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.328727 2566 server.go:317] "Adding debug handlers to kubelet server" Apr 24 23:53:20.332328 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.331250 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-62.ec2.internal.18a9701d94b1488b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-62.ec2.internal,UID:ip-10-0-139-62.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-62.ec2.internal,},FirstTimestamp:2026-04-24 23:53:20.325380235 +0000 UTC m=+0.454588508,LastTimestamp:2026-04-24 23:53:20.325380235 +0000 UTC m=+0.454588508,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-62.ec2.internal,}" Apr 24 23:53:20.334511 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.334492 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 23:53:20.335070 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.335053 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 23:53:20.335846 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.335825 2566 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 23:53:20.335846 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.335847 2566 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 23:53:20.335964 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.335953 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 23:53:20.336022 ip-10-0-139-62 kubenswrapper[2566]: I0424 
23:53:20.335988 2566 reconstruct.go:97] "Volume reconstruction finished" Apr 24 23:53:20.336022 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.335997 2566 reconciler.go:26] "Reconciler: start to sync state" Apr 24 23:53:20.336091 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.336033 2566 factory.go:55] Registering systemd factory Apr 24 23:53:20.336091 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.336089 2566 factory.go:223] Registration of the systemd container factory successfully Apr 24 23:53:20.336197 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.336153 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:20.336334 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.336319 2566 factory.go:153] Registering CRI-O factory Apr 24 23:53:20.336334 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.336337 2566 factory.go:223] Registration of the crio container factory successfully Apr 24 23:53:20.336460 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.336390 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 23:53:20.336460 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.336331 2566 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 23:53:20.336460 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.336415 2566 factory.go:103] Registering Raw factory Apr 24 23:53:20.336460 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.336429 2566 manager.go:1196] Started watching for new ooms in manager Apr 24 23:53:20.336827 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.336813 2566 manager.go:319] Starting recovery of all containers Apr 24 23:53:20.338875 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.338852 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-92st6" Apr 24 23:53:20.340243 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.340218 2566 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-139-62.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 23:53:20.340393 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.340371 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 23:53:20.346914 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.346738 2566 manager.go:324] Recovery completed Apr 24 23:53:20.348370 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.348328 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-92st6" Apr 24 23:53:20.352337 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.352323 2566 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 24 23:53:20.354900 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.354886 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:20.354963 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.354916 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:20.354963 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.354926 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:20.355453 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.355440 2566 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 23:53:20.355453 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.355449 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 23:53:20.355537 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.355464 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:20.356704 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.356620 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-62.ec2.internal.18a9701d9673b910 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-62.ec2.internal,UID:ip-10-0-139-62.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-139-62.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-139-62.ec2.internal,},FirstTimestamp:2026-04-24 23:53:20.35490024 +0000 UTC m=+0.484108517,LastTimestamp:2026-04-24 23:53:20.35490024 +0000 
UTC m=+0.484108517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-62.ec2.internal,}" Apr 24 23:53:20.359172 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.359161 2566 policy_none.go:49] "None policy: Start" Apr 24 23:53:20.359228 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.359176 2566 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 23:53:20.359228 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.359185 2566 state_mem.go:35] "Initializing new in-memory state store" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.396802 2566 manager.go:341] "Starting Device Plugin manager" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.396845 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.396860 2566 server.go:85] "Starting device plugin registration server" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.397110 2566 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.397122 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.397246 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.397336 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.397346 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 
23:53:20.397533 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.399231 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.399259 2566 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.399278 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.399286 2566 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.399325 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.399549 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.399588 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:20.414164 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.402547 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:20.497476 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.497402 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:20.498568 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.498551 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:20.498652 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.498582 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:20.498652 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.498592 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:20.498652 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.498619 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.499646 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.499629 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-62.ec2.internal"] Apr 24 23:53:20.499700 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.499692 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:20.500468 ip-10-0-139-62 
kubenswrapper[2566]: I0424 23:53:20.500455 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:20.500530 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.500480 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:20.500530 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.500491 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:20.502779 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.502766 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:20.502899 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.502883 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.502932 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.502915 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:20.503429 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.503415 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:20.503513 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.503441 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:20.503513 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.503415 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:20.503513 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.503453 2566 kubelet_node_status.go:736] "Recording event message 
for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:20.503513 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.503467 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:20.503513 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.503480 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:20.505718 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.505701 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.505826 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.505725 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:20.506375 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.506360 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:20.506457 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.506382 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:20.506457 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.506395 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:20.508012 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.507997 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.508099 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.508020 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-62.ec2.internal\": node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 
23:53:20.519265 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.519242 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:20.533166 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.533140 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-62.ec2.internal\" not found" node="ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.537546 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.537524 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-62.ec2.internal\" not found" node="ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.619422 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.619393 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:20.636881 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.636855 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68d8ca4751462a8913290f68bba7fb20-config\") pod \"kube-apiserver-proxy-ip-10-0-139-62.ec2.internal\" (UID: \"68d8ca4751462a8913290f68bba7fb20\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.636977 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.636888 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc283eaa290b1f4b05532791f305128c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal\" (UID: \"fc283eaa290b1f4b05532791f305128c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.636977 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.636916 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc283eaa290b1f4b05532791f305128c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal\" (UID: \"fc283eaa290b1f4b05532791f305128c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.720000 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.719960 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:20.737382 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.737346 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68d8ca4751462a8913290f68bba7fb20-config\") pod \"kube-apiserver-proxy-ip-10-0-139-62.ec2.internal\" (UID: \"68d8ca4751462a8913290f68bba7fb20\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.737479 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.737366 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68d8ca4751462a8913290f68bba7fb20-config\") pod \"kube-apiserver-proxy-ip-10-0-139-62.ec2.internal\" (UID: \"68d8ca4751462a8913290f68bba7fb20\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.737479 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.737429 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc283eaa290b1f4b05532791f305128c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal\" (UID: \"fc283eaa290b1f4b05532791f305128c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.737479 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.737462 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc283eaa290b1f4b05532791f305128c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal\" (UID: \"fc283eaa290b1f4b05532791f305128c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.737593 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.737502 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc283eaa290b1f4b05532791f305128c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal\" (UID: \"fc283eaa290b1f4b05532791f305128c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.737593 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.737506 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc283eaa290b1f4b05532791f305128c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal\" (UID: \"fc283eaa290b1f4b05532791f305128c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.820774 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.820706 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:20.835882 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.835864 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.841787 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:20.841769 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-62.ec2.internal" Apr 24 23:53:20.921038 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:20.920986 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:21.021518 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:21.021476 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:21.122152 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:21.122079 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:21.222644 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:21.222602 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:21.243220 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.243167 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 23:53:21.243369 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.243346 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 23:53:21.322758 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:21.322723 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:21.335578 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.335552 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 23:53:21.348879 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.348853 2566 reflector.go:430] 
"Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 23:53:21.351731 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.351696 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 23:48:20 +0000 UTC" deadline="2027-10-31 19:11:44.178756117 +0000 UTC" Apr 24 23:53:21.351731 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.351729 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13315h18m22.827029908s" Apr 24 23:53:21.369319 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.369295 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9fqbv" Apr 24 23:53:21.376683 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.376633 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9fqbv" Apr 24 23:53:21.423246 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:21.423216 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:21.500870 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:21.500836 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68d8ca4751462a8913290f68bba7fb20.slice/crio-09b3e02344a8698414cd6845fd936de3090ceb2454c7a38ed2770aa11f692443 WatchSource:0}: Error finding container 09b3e02344a8698414cd6845fd936de3090ceb2454c7a38ed2770aa11f692443: Status 404 returned error can't find the container with id 09b3e02344a8698414cd6845fd936de3090ceb2454c7a38ed2770aa11f692443 Apr 24 23:53:21.501401 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:21.501381 2566 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc283eaa290b1f4b05532791f305128c.slice/crio-3e2a2e9e3e94bcb1cffe0e9a1284308ed4812d2564c745e1ec4a5a0c04d15d82 WatchSource:0}: Error finding container 3e2a2e9e3e94bcb1cffe0e9a1284308ed4812d2564c745e1ec4a5a0c04d15d82: Status 404 returned error can't find the container with id 3e2a2e9e3e94bcb1cffe0e9a1284308ed4812d2564c745e1ec4a5a0c04d15d82 Apr 24 23:53:21.506501 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.506067 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 23:53:21.523866 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:21.523839 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-62.ec2.internal\" not found" Apr 24 23:53:21.593795 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.593770 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:21.635963 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.635880 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" Apr 24 23:53:21.649977 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.649954 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 23:53:21.651757 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.651740 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-62.ec2.internal" Apr 24 23:53:21.659685 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.659660 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 23:53:21.713372 
ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.713339 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:21.779684 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:21.779657 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:22.313592 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.313562 2566 apiserver.go:52] "Watching apiserver" Apr 24 23:53:22.323291 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.323265 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 23:53:22.323777 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.323751 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bx6k8","kube-system/konnectivity-agent-xfncr","openshift-dns/node-resolver-jgjtn","openshift-image-registry/node-ca-64t8l","openshift-multus/multus-additional-cni-plugins-pcfrr","openshift-multus/multus-g4bsj","openshift-multus/network-metrics-daemon-c6pqs","openshift-network-operator/iptables-alerter-2cb6b","kube-system/kube-apiserver-proxy-ip-10-0-139-62.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57","openshift-cluster-node-tuning-operator/tuned-skn6p","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal","openshift-network-diagnostics/network-check-target-vq8nz","openshift-ovn-kubernetes/ovnkube-node-tfv9v"] Apr 24 23:53:22.326903 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.326861 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:22.327052 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.326956 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b" Apr 24 23:53:22.329091 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.328983 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:53:22.329091 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.329056 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad" Apr 24 23:53:22.333501 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.333291 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jgjtn" Apr 24 23:53:22.333501 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.333368 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-64t8l" Apr 24 23:53:22.335777 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.335758 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 23:53:22.337374 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.335988 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qc7t9\"" Apr 24 23:53:22.337374 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.336830 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rgfxr\"" Apr 24 23:53:22.337374 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.336990 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 23:53:22.337872 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.337856 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 23:53:22.338050 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.338002 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 23:53:22.338050 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.338037 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 23:53:22.338775 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.338481 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pcfrr" Apr 24 23:53:22.340808 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.340792 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.341334 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.341316 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wf2j9\"" Apr 24 23:53:22.341423 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.341382 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 23:53:22.341517 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.341500 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 23:53:22.341517 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.341319 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 23:53:22.341665 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.341529 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 23:53:22.341665 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.341593 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 23:53:22.342741 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.342708 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 23:53:22.343145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.343127 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pcwkl\"" Apr 24 23:53:22.346006 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.345485 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.346526 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346433 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k5g2\" (UniqueName: \"kubernetes.io/projected/7fd21580-2e57-4cb2-8470-18fa0629553c-kube-api-access-5k5g2\") pod \"node-ca-64t8l\" (UID: \"7fd21580-2e57-4cb2-8470-18fa0629553c\") " pod="openshift-image-registry/node-ca-64t8l" Apr 24 23:53:22.346526 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346469 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-cni-binary-copy\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr" Apr 24 23:53:22.346526 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346495 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7f6k\" (UniqueName: \"kubernetes.io/projected/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-kube-api-access-c7f6k\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr" Apr 24 23:53:22.346725 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346526 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2-tmp-dir\") pod \"node-resolver-jgjtn\" (UID: \"3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2\") " pod="openshift-dns/node-resolver-jgjtn" Apr 24 23:53:22.346725 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346585 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-system-cni-dir\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr" Apr 24 23:53:22.346725 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346638 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-cnibin\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr" Apr 24 23:53:22.346725 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346664 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr" Apr 24 23:53:22.346725 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346702 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr" Apr 24 23:53:22.346955 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346739 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7fd21580-2e57-4cb2-8470-18fa0629553c-serviceca\") pod \"node-ca-64t8l\" (UID: \"7fd21580-2e57-4cb2-8470-18fa0629553c\") " 
pod="openshift-image-registry/node-ca-64t8l" Apr 24 23:53:22.346955 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346793 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh9x2\" (UniqueName: \"kubernetes.io/projected/f9f062da-f1a8-4e5a-ac2f-ad672791353b-kube-api-access-fh9x2\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:22.346955 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346819 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw76w\" (UniqueName: \"kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w\") pod \"network-check-target-vq8nz\" (UID: \"97ecc28e-c411-4b57-86a8-d793acbd08ad\") " pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:53:22.346955 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346845 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkm6z\" (UniqueName: \"kubernetes.io/projected/3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2-kube-api-access-zkm6z\") pod \"node-resolver-jgjtn\" (UID: \"3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2\") " pod="openshift-dns/node-resolver-jgjtn" Apr 24 23:53:22.346955 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346878 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fd21580-2e57-4cb2-8470-18fa0629553c-host\") pod \"node-ca-64t8l\" (UID: \"7fd21580-2e57-4cb2-8470-18fa0629553c\") " pod="openshift-image-registry/node-ca-64t8l" Apr 24 23:53:22.346955 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346943 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-os-release\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr" Apr 24 23:53:22.347227 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.346999 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr" Apr 24 23:53:22.347227 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.347030 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:22.347227 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.347054 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2-hosts-file\") pod \"node-resolver-jgjtn\" (UID: \"3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2\") " pod="openshift-dns/node-resolver-jgjtn" Apr 24 23:53:22.347745 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.347730 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 23:53:22.348120 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.348101 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-xfncr" Apr 24 23:53:22.348345 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.348330 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 23:53:22.348650 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.348629 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vpkzx\"" Apr 24 23:53:22.349155 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.349138 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 23:53:22.349383 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.349368 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 23:53:22.349466 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.349450 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 23:53:22.350107 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.349989 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 23:53:22.350302 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.350288 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 23:53:22.351055 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.351034 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:22.351157 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.351106 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada" Apr 24 23:53:22.351230 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.351153 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.352442 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.352419 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 23:53:22.353514 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.353416 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-w78bb\"" Apr 24 23:53:22.356012 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.355958 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 23:53:22.356381 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.356363 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 23:53:22.356501 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.356437 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rx27f\"" Apr 24 23:53:22.356565 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.356446 2566 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 23:53:22.357613 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.357230 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.359530 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.359508 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:53:22.359774 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.359758 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8j9vs\"" Apr 24 23:53:22.359774 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.359766 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 23:53:22.359903 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.359836 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2cb6b" Apr 24 23:53:22.362378 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.362359 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gfckq\"" Apr 24 23:53:22.362455 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.362420 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 23:53:22.362601 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.362567 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 23:53:22.363141 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.363123 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 23:53:22.377305 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.377280 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:21 +0000 UTC" deadline="2028-01-10 10:45:34.170594116 +0000 UTC" Apr 24 23:53:22.377305 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.377304 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15010h52m11.793293115s" Apr 24 23:53:22.404755 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.404690 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" event={"ID":"fc283eaa290b1f4b05532791f305128c","Type":"ContainerStarted","Data":"3e2a2e9e3e94bcb1cffe0e9a1284308ed4812d2564c745e1ec4a5a0c04d15d82"} Apr 24 23:53:22.405687 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.405666 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-139-62.ec2.internal" event={"ID":"68d8ca4751462a8913290f68bba7fb20","Type":"ContainerStarted","Data":"09b3e02344a8698414cd6845fd936de3090ceb2454c7a38ed2770aa11f692443"} Apr 24 23:53:22.437780 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.437748 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 23:53:22.448259 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448219 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-run-ovn\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.448259 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448256 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f63c8907-4f05-4332-84e3-9ca9c74f643c-ovnkube-script-lib\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.448486 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448291 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-system-cni-dir\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr" Apr 24 23:53:22.448486 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448315 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-etc-kubernetes\") pod \"multus-g4bsj\" (UID: 
\"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.448486 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448350 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-system-cni-dir\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr" Apr 24 23:53:22.448486 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448341 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-socket-dir\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.448486 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448423 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-device-dir\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.448486 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448466 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6g6q\" (UniqueName: \"kubernetes.io/projected/f63c8907-4f05-4332-84e3-9ca9c74f643c-kube-api-access-x6g6q\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.448836 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448495 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-etc-selinux\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57"
Apr 24 23:53:22.448836 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448529 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-modprobe-d\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.448836 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448552 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-sysctl-d\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.448836 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448575 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-systemd\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.448836 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448597 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-host\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.448836 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448646 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2-hosts-file\") pod \"node-resolver-jgjtn\" (UID: \"3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2\") " pod="openshift-dns/node-resolver-jgjtn"
Apr 24 23:53:22.448836 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448711 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2-hosts-file\") pod \"node-resolver-jgjtn\" (UID: \"3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2\") " pod="openshift-dns/node-resolver-jgjtn"
Apr 24 23:53:22.448836 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448718 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-sys-fs\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57"
Apr 24 23:53:22.448836 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448761 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-kubernetes\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.448836 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448798 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-systemd-units\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.448836 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448819 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e19a4fe-f15a-4907-a972-471635868ded-iptables-alerter-script\") pod \"iptables-alerter-2cb6b\" (UID: \"2e19a4fe-f15a-4907-a972-471635868ded\") " pod="openshift-network-operator/iptables-alerter-2cb6b"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448863 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448905 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5k5g2\" (UniqueName: \"kubernetes.io/projected/7fd21580-2e57-4cb2-8470-18fa0629553c-kube-api-access-5k5g2\") pod \"node-ca-64t8l\" (UID: \"7fd21580-2e57-4cb2-8470-18fa0629553c\") " pod="openshift-image-registry/node-ca-64t8l"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.448937 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-system-cni-dir\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449064 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-dbus\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449122 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-sys\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449157 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-lib-modules\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449184 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-run-netns\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449232 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-os-release\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449268 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f63c8907-4f05-4332-84e3-9ca9c74f643c-ovnkube-config\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449298 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7fd21580-2e57-4cb2-8470-18fa0629553c-serviceca\") pod \"node-ca-64t8l\" (UID: \"7fd21580-2e57-4cb2-8470-18fa0629553c\") " pod="openshift-image-registry/node-ca-64t8l"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449335 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkm6z\" (UniqueName: \"kubernetes.io/projected/3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2-kube-api-access-zkm6z\") pod \"node-resolver-jgjtn\" (UID: \"3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2\") " pod="openshift-dns/node-resolver-jgjtn"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449367 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-cni-binary-copy\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.449397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449392 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-multus-socket-dir-parent\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449418 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-run-k8s-cni-cncf-io\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449457 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-etc-openvswitch\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449488 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-node-log\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449515 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-run-ovn-kubernetes\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449543 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-os-release\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449572 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449598 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-multus-conf-dir\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449622 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-cni-bin\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449647 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-os-release\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449651 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f63c8907-4f05-4332-84e3-9ca9c74f643c-ovn-node-metrics-cert\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449686 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e19a4fe-f15a-4907-a972-471635868ded-host-slash\") pod \"iptables-alerter-2cb6b\" (UID: \"2e19a4fe-f15a-4907-a972-471635868ded\") " pod="openshift-network-operator/iptables-alerter-2cb6b"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449710 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-multus-cni-dir\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.449728 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449734 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-var-lib-kubelet\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449736 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7fd21580-2e57-4cb2-8470-18fa0629553c-serviceca\") pod \"node-ca-64t8l\" (UID: \"7fd21580-2e57-4cb2-8470-18fa0629553c\") " pod="openshift-image-registry/node-ca-64t8l"
Apr 24 23:53:22.449867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449758 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-tuned\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.449797 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs podName:f9f062da-f1a8-4e5a-ac2f-ad672791353b nodeName:}" failed. No retries permitted until 2026-04-24 23:53:22.949772664 +0000 UTC m=+3.078980935 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs") pod "network-metrics-daemon-c6pqs" (UID: "f9f062da-f1a8-4e5a-ac2f-ad672791353b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449811 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449839 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh9x2\" (UniqueName: \"kubernetes.io/projected/f9f062da-f1a8-4e5a-ac2f-ad672791353b-kube-api-access-fh9x2\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449924 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw76w\" (UniqueName: \"kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w\") pod \"network-check-target-vq8nz\" (UID: \"97ecc28e-c411-4b57-86a8-d793acbd08ad\") " pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449948 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-run-netns\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449963 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-run-multus-certs\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449981 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-run\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.449995 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-cni-netd\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450009 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th674\" (UniqueName: \"kubernetes.io/projected/2e19a4fe-f15a-4907-a972-471635868ded-kube-api-access-th674\") pod \"iptables-alerter-2cb6b\" (UID: \"2e19a4fe-f15a-4907-a972-471635868ded\") " pod="openshift-network-operator/iptables-alerter-2cb6b"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450025 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450067 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-cnibin\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450107 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-var-lib-cni-multus\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450137 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d0fe631a-be83-446b-90d8-57f1d40d01e3-tmp\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450158 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-var-lib-openvswitch\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450184 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-cni-binary-copy\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.450585 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450230 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7f6k\" (UniqueName: \"kubernetes.io/projected/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-kube-api-access-c7f6k\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450261 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnrp5\" (UniqueName: \"kubernetes.io/projected/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-kube-api-access-gnrp5\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450285 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450309 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-run-openvswitch\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450336 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-cnibin\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450361 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450426 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450453 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5de5dff9-24ac-4c52-a324-20a9923ea60b-agent-certs\") pod \"konnectivity-agent-xfncr\" (UID: \"5de5dff9-24ac-4c52-a324-20a9923ea60b\") " pod="kube-system/konnectivity-agent-xfncr"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450480 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-run-systemd\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450505 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-log-socket\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450574 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-multus-daemon-config\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450598 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450609 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-cni-binary-copy\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450599 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76g5b\" (UniqueName: \"kubernetes.io/projected/971d2d30-8f97-4832-a49b-14a3877e3eb3-kube-api-access-76g5b\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450703 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-sysctl-conf\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450727 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.451326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450776 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-cnibin\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450835 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-slash\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450932 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f63c8907-4f05-4332-84e3-9ca9c74f643c-env-overrides\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450953 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fd21580-2e57-4cb2-8470-18fa0629553c-host\") pod \"node-ca-64t8l\" (UID: \"7fd21580-2e57-4cb2-8470-18fa0629553c\") " pod="openshift-image-registry/node-ca-64t8l"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.450979 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-var-lib-cni-bin\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.451030 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-var-lib-kubelet\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.451056 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.451077 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fd21580-2e57-4cb2-8470-18fa0629553c-host\") pod \"node-ca-64t8l\" (UID: \"7fd21580-2e57-4cb2-8470-18fa0629553c\") " pod="openshift-image-registry/node-ca-64t8l"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.451082 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5de5dff9-24ac-4c52-a324-20a9923ea60b-konnectivity-ca\") pod \"konnectivity-agent-xfncr\" (UID: \"5de5dff9-24ac-4c52-a324-20a9923ea60b\") " pod="kube-system/konnectivity-agent-xfncr"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.451112 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-registration-dir\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.451134 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-sysconfig\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.451160 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57zfn\" (UniqueName: \"kubernetes.io/projected/d0fe631a-be83-446b-90d8-57f1d40d01e3-kube-api-access-57zfn\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.451183 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-kubelet\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.451226 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2-tmp-dir\") pod \"node-resolver-jgjtn\" (UID: \"3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2\") " pod="openshift-dns/node-resolver-jgjtn"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.451251 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-hostroot\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.451274 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-kubelet-config\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:22.452145 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.451515 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2-tmp-dir\") pod \"node-resolver-jgjtn\" (UID: \"3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2\") " pod="openshift-dns/node-resolver-jgjtn"
Apr 24 23:53:22.456525 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.456488 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:22.456525 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.456518 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:22.456698 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.456532 2566 projected.go:194] Error preparing data for projected volume kube-api-access-qw76w for pod openshift-network-diagnostics/network-check-target-vq8nz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:22.456698 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.456613 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w podName:97ecc28e-c411-4b57-86a8-d793acbd08ad nodeName:}" failed. No retries permitted until 2026-04-24 23:53:22.95659536 +0000 UTC m=+3.085803632 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qw76w" (UniqueName: "kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w") pod "network-check-target-vq8nz" (UID: "97ecc28e-c411-4b57-86a8-d793acbd08ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:22.456826 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.456786 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 23:53:22.460552 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.460523 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7f6k\" (UniqueName: \"kubernetes.io/projected/2aa55bd4-e281-4226-85cb-d9aa2ce0bd34-kube-api-access-c7f6k\") pod \"multus-additional-cni-plugins-pcfrr\" (UID: \"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34\") " pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.460656 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.460558 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh9x2\" (UniqueName: \"kubernetes.io/projected/f9f062da-f1a8-4e5a-ac2f-ad672791353b-kube-api-access-fh9x2\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:22.460656 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.460528 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkm6z\" (UniqueName: \"kubernetes.io/projected/3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2-kube-api-access-zkm6z\")
pod \"node-resolver-jgjtn\" (UID: \"3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2\") " pod="openshift-dns/node-resolver-jgjtn" Apr 24 23:53:22.460656 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.460592 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k5g2\" (UniqueName: \"kubernetes.io/projected/7fd21580-2e57-4cb2-8470-18fa0629553c-kube-api-access-5k5g2\") pod \"node-ca-64t8l\" (UID: \"7fd21580-2e57-4cb2-8470-18fa0629553c\") " pod="openshift-image-registry/node-ca-64t8l" Apr 24 23:53:22.552535 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552494 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-multus-cni-dir\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.552535 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552539 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-var-lib-kubelet\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.552763 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552566 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-tuned\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.552763 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552595 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.552763 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552635 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-run-netns\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.552763 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552646 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-multus-cni-dir\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.552763 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552658 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-run-multus-certs\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.552763 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552691 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-run-multus-certs\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.552763 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552692 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.552763 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552632 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-var-lib-kubelet\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.552763 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552714 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-run\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.552763 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552748 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-cni-netd\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552774 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th674\" (UniqueName: \"kubernetes.io/projected/2e19a4fe-f15a-4907-a972-471635868ded-kube-api-access-th674\") pod \"iptables-alerter-2cb6b\" (UID: \"2e19a4fe-f15a-4907-a972-471635868ded\") " pod="openshift-network-operator/iptables-alerter-2cb6b" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552776 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-run-netns\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552801 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-cnibin\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552827 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-var-lib-cni-multus\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552842 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-run\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552849 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d0fe631a-be83-446b-90d8-57f1d40d01e3-tmp\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552894 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-cnibin\") pod 
\"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552934 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-var-lib-cni-multus\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.552975 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-cni-netd\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553072 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-var-lib-openvswitch\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553105 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnrp5\" (UniqueName: \"kubernetes.io/projected/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-kube-api-access-gnrp5\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553130 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret\") pod 
\"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553155 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-run-openvswitch\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553184 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5de5dff9-24ac-4c52-a324-20a9923ea60b-agent-certs\") pod \"konnectivity-agent-xfncr\" (UID: \"5de5dff9-24ac-4c52-a324-20a9923ea60b\") " pod="kube-system/konnectivity-agent-xfncr" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553188 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-var-lib-openvswitch\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553222 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-run-systemd\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553246 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-log-socket\") 
pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553252 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-run-openvswitch\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553269 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-multus-daemon-config\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553307 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76g5b\" (UniqueName: \"kubernetes.io/projected/971d2d30-8f97-4832-a49b-14a3877e3eb3-kube-api-access-76g5b\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.553323 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553332 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-sysctl-conf\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 
23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.553372 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret podName:4d4e321e-40d2-4107-9dbd-581cbfeb3ada nodeName:}" failed. No retries permitted until 2026-04-24 23:53:23.053354114 +0000 UTC m=+3.182562386 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret") pod "global-pull-secret-syncer-bx6k8" (UID: "4d4e321e-40d2-4107-9dbd-581cbfeb3ada") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553385 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-run-systemd\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553440 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-log-socket\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553469 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-slash\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553489 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f63c8907-4f05-4332-84e3-9ca9c74f643c-env-overrides\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553501 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-slash\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553468 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-sysctl-conf\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553505 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-var-lib-cni-bin\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553546 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-var-lib-kubelet\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553562 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5de5dff9-24ac-4c52-a324-20a9923ea60b-konnectivity-ca\") pod \"konnectivity-agent-xfncr\" (UID: \"5de5dff9-24ac-4c52-a324-20a9923ea60b\") " pod="kube-system/konnectivity-agent-xfncr" Apr 24 23:53:22.553912 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553562 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-var-lib-cni-bin\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553592 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-var-lib-kubelet\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553592 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-registration-dir\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553620 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-sysconfig\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553644 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57zfn\" (UniqueName: \"kubernetes.io/projected/d0fe631a-be83-446b-90d8-57f1d40d01e3-kube-api-access-57zfn\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553668 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-kubelet\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553706 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-hostroot\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553750 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-kubelet-config\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553786 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-run-ovn\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553811 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f63c8907-4f05-4332-84e3-9ca9c74f643c-ovnkube-script-lib\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553838 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-etc-kubernetes\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553837 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-multus-daemon-config\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553861 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-socket-dir\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553901 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-hostroot\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553911 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-kubelet-config\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553956 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-registration-dir\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.553979 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f63c8907-4f05-4332-84e3-9ca9c74f643c-env-overrides\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.554692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554003 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-sysconfig\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554014 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-socket-dir\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554042 
2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-device-dir\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554053 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-kubelet\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554072 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6g6q\" (UniqueName: \"kubernetes.io/projected/f63c8907-4f05-4332-84e3-9ca9c74f643c-kube-api-access-x6g6q\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554090 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-etc-kubernetes\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554098 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-etc-selinux\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.555470 ip-10-0-139-62 
kubenswrapper[2566]: I0424 23:53:22.554113 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5de5dff9-24ac-4c52-a324-20a9923ea60b-konnectivity-ca\") pod \"konnectivity-agent-xfncr\" (UID: \"5de5dff9-24ac-4c52-a324-20a9923ea60b\") " pod="kube-system/konnectivity-agent-xfncr" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554122 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-modprobe-d\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554147 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-sysctl-d\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554172 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-systemd\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554196 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-host\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 
23:53:22.554240 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-sys-fs\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554264 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-kubernetes\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554288 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-systemd-units\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554313 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e19a4fe-f15a-4907-a972-471635868ded-iptables-alerter-script\") pod \"iptables-alerter-2cb6b\" (UID: \"2e19a4fe-f15a-4907-a972-471635868ded\") " pod="openshift-network-operator/iptables-alerter-2cb6b" Apr 24 23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554330 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-device-dir\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 
23:53:22.555470 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554339 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554381 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-system-cni-dir\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554393 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-run-ovn\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554405 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-dbus\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554448 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-host\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.556171 ip-10-0-139-62 
kubenswrapper[2566]: I0424 23:53:22.554476 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-sys\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554510 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-lib-modules\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554521 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-dbus\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554527 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f63c8907-4f05-4332-84e3-9ca9c74f643c-ovnkube-script-lib\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554556 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-sys-fs\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 
23:53:22.554535 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-run-netns\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554569 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-system-cni-dir\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554382 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554590 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-os-release\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554600 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-sys\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554604 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-systemd\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554615 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f63c8907-4f05-4332-84e3-9ca9c74f643c-ovnkube-config\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554646 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-cni-binary-copy\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.556171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554651 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-modprobe-d\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554675 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-multus-socket-dir-parent\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554694 2566 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-lib-modules\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554512 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/971d2d30-8f97-4832-a49b-14a3877e3eb3-etc-selinux\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554708 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-kubernetes\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554563 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-run-netns\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554708 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-sysctl-d\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554722 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-run-k8s-cni-cncf-io\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554764 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-host-run-k8s-cni-cncf-io\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554786 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-os-release\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555238 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-cni-binary-copy\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.554878 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-etc-openvswitch\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555302 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/f63c8907-4f05-4332-84e3-9ca9c74f643c-ovnkube-config\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555331 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d0fe631a-be83-446b-90d8-57f1d40d01e3-etc-tuned\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555334 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-multus-socket-dir-parent\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555363 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-systemd-units\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555372 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-node-log\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555278 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/2e19a4fe-f15a-4907-a972-471635868ded-iptables-alerter-script\") pod \"iptables-alerter-2cb6b\" (UID: \"2e19a4fe-f15a-4907-a972-471635868ded\") " pod="openshift-network-operator/iptables-alerter-2cb6b" Apr 24 23:53:22.556772 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555403 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-etc-openvswitch\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.557458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555442 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-run-ovn-kubernetes\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.557458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555451 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-node-log\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.557458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555492 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-run-ovn-kubernetes\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.557458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555553 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-multus-conf-dir\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.557458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555625 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-multus-conf-dir\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.557458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555653 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-cni-bin\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.557458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555692 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f63c8907-4f05-4332-84e3-9ca9c74f643c-ovn-node-metrics-cert\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.557458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555692 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d0fe631a-be83-446b-90d8-57f1d40d01e3-tmp\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.557458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555718 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/f63c8907-4f05-4332-84e3-9ca9c74f643c-host-cni-bin\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.557458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555760 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e19a4fe-f15a-4907-a972-471635868ded-host-slash\") pod \"iptables-alerter-2cb6b\" (UID: \"2e19a4fe-f15a-4907-a972-471635868ded\") " pod="openshift-network-operator/iptables-alerter-2cb6b" Apr 24 23:53:22.557458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555861 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e19a4fe-f15a-4907-a972-471635868ded-host-slash\") pod \"iptables-alerter-2cb6b\" (UID: \"2e19a4fe-f15a-4907-a972-471635868ded\") " pod="openshift-network-operator/iptables-alerter-2cb6b" Apr 24 23:53:22.557458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.555908 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5de5dff9-24ac-4c52-a324-20a9923ea60b-agent-certs\") pod \"konnectivity-agent-xfncr\" (UID: \"5de5dff9-24ac-4c52-a324-20a9923ea60b\") " pod="kube-system/konnectivity-agent-xfncr" Apr 24 23:53:22.557827 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.557753 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f63c8907-4f05-4332-84e3-9ca9c74f643c-ovn-node-metrics-cert\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.562306 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.562283 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76g5b\" (UniqueName: 
\"kubernetes.io/projected/971d2d30-8f97-4832-a49b-14a3877e3eb3-kube-api-access-76g5b\") pod \"aws-ebs-csi-driver-node-jwk57\" (UID: \"971d2d30-8f97-4832-a49b-14a3877e3eb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" Apr 24 23:53:22.563076 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.562954 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th674\" (UniqueName: \"kubernetes.io/projected/2e19a4fe-f15a-4907-a972-471635868ded-kube-api-access-th674\") pod \"iptables-alerter-2cb6b\" (UID: \"2e19a4fe-f15a-4907-a972-471635868ded\") " pod="openshift-network-operator/iptables-alerter-2cb6b" Apr 24 23:53:22.563076 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.563045 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnrp5\" (UniqueName: \"kubernetes.io/projected/f7d067fa-72fb-42f4-92b9-edee24d3ed1e-kube-api-access-gnrp5\") pod \"multus-g4bsj\" (UID: \"f7d067fa-72fb-42f4-92b9-edee24d3ed1e\") " pod="openshift-multus/multus-g4bsj" Apr 24 23:53:22.563663 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.563614 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6g6q\" (UniqueName: \"kubernetes.io/projected/f63c8907-4f05-4332-84e3-9ca9c74f643c-kube-api-access-x6g6q\") pod \"ovnkube-node-tfv9v\" (UID: \"f63c8907-4f05-4332-84e3-9ca9c74f643c\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:22.563663 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.563649 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57zfn\" (UniqueName: \"kubernetes.io/projected/d0fe631a-be83-446b-90d8-57f1d40d01e3-kube-api-access-57zfn\") pod \"tuned-skn6p\" (UID: \"d0fe631a-be83-446b-90d8-57f1d40d01e3\") " pod="openshift-cluster-node-tuning-operator/tuned-skn6p" Apr 24 23:53:22.585966 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.585939 2566 reflector.go:430] "Caches populated" 
type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:22.650094 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.650056 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jgjtn"
Apr 24 23:53:22.659999 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.659971 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-64t8l"
Apr 24 23:53:22.668654 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.668634 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pcfrr"
Apr 24 23:53:22.675326 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.675308 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g4bsj"
Apr 24 23:53:22.681942 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.681924 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:53:22.688449 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.688433 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xfncr"
Apr 24 23:53:22.696059 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.696040 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57"
Apr 24 23:53:22.702649 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.702630 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-skn6p"
Apr 24 23:53:22.711237 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.711200 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2cb6b" Apr 24 23:53:22.959491 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.959414 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw76w\" (UniqueName: \"kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w\") pod \"network-check-target-vq8nz\" (UID: \"97ecc28e-c411-4b57-86a8-d793acbd08ad\") " pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:53:22.959647 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:22.959504 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:22.959647 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.959582 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:22.959647 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.959592 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:22.959647 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.959601 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:22.959647 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.959613 2566 projected.go:194] Error preparing data for projected volume kube-api-access-qw76w for pod openshift-network-diagnostics/network-check-target-vq8nz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:22.959843 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.959656 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs podName:f9f062da-f1a8-4e5a-ac2f-ad672791353b nodeName:}" failed. No retries permitted until 2026-04-24 23:53:23.959636972 +0000 UTC m=+4.088845255 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs") pod "network-metrics-daemon-c6pqs" (UID: "f9f062da-f1a8-4e5a-ac2f-ad672791353b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:22.959843 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:22.959676 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w podName:97ecc28e-c411-4b57-86a8-d793acbd08ad nodeName:}" failed. No retries permitted until 2026-04-24 23:53:23.959667247 +0000 UTC m=+4.088875505 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qw76w" (UniqueName: "kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w") pod "network-check-target-vq8nz" (UID: "97ecc28e-c411-4b57-86a8-d793acbd08ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:23.060532 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.060496 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:23.060701 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:23.060657 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:23.060764 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:23.060721 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret podName:4d4e321e-40d2-4107-9dbd-581cbfeb3ada nodeName:}" failed. No retries permitted until 2026-04-24 23:53:24.06070683 +0000 UTC m=+4.189915092 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret") pod "global-pull-secret-syncer-bx6k8" (UID: "4d4e321e-40d2-4107-9dbd-581cbfeb3ada") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:23.351279 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:23.351244 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e19a4fe_f15a_4907_a972_471635868ded.slice/crio-84a5bbf9936beef89283134b2827dc119a3213ff349bba72adf570a2fbd39df4 WatchSource:0}: Error finding container 84a5bbf9936beef89283134b2827dc119a3213ff349bba72adf570a2fbd39df4: Status 404 returned error can't find the container with id 84a5bbf9936beef89283134b2827dc119a3213ff349bba72adf570a2fbd39df4
Apr 24 23:53:23.352546 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:23.352514 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7d067fa_72fb_42f4_92b9_edee24d3ed1e.slice/crio-8dd75bd23c7a00462c4b76d95df789fc23f35554f681356c3bf48af4ec3a5415 WatchSource:0}: Error finding container 8dd75bd23c7a00462c4b76d95df789fc23f35554f681356c3bf48af4ec3a5415: Status 404 returned error can't find the container with id 8dd75bd23c7a00462c4b76d95df789fc23f35554f681356c3bf48af4ec3a5415
Apr 24 23:53:23.354244 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:23.354220 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5de5dff9_24ac_4c52_a324_20a9923ea60b.slice/crio-5752fc7695246ae927d906e1f3c1cdcccf4032f6c938be1d51c6d015775c8ac3 WatchSource:0}: Error finding container 5752fc7695246ae927d906e1f3c1cdcccf4032f6c938be1d51c6d015775c8ac3: Status 404 returned error can't find the container with id 5752fc7695246ae927d906e1f3c1cdcccf4032f6c938be1d51c6d015775c8ac3
Apr 24 23:53:23.354829 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:23.354805 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa55bd4_e281_4226_85cb_d9aa2ce0bd34.slice/crio-5c27e85d91bfc4d226c782b015ba6a80aeb6e418b0c3ff6f5926b0de07e958e2 WatchSource:0}: Error finding container 5c27e85d91bfc4d226c782b015ba6a80aeb6e418b0c3ff6f5926b0de07e958e2: Status 404 returned error can't find the container with id 5c27e85d91bfc4d226c782b015ba6a80aeb6e418b0c3ff6f5926b0de07e958e2
Apr 24 23:53:23.358399 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:23.358375 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a61a51c_e12c_4ab5_ac3f_b0d6b58d6ea2.slice/crio-18e38c8c19e3da465f29ae0a4f0e3691745149c061c162a8556ff9fbbfc8d946 WatchSource:0}: Error finding container 18e38c8c19e3da465f29ae0a4f0e3691745149c061c162a8556ff9fbbfc8d946: Status 404 returned error can't find the container with id 18e38c8c19e3da465f29ae0a4f0e3691745149c061c162a8556ff9fbbfc8d946
Apr 24 23:53:23.362540 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:23.362513 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod971d2d30_8f97_4832_a49b_14a3877e3eb3.slice/crio-6257276858d8d937cbf5f64838cbef3e4aec3d9a628c76c24c81328aa38cd0b5 WatchSource:0}: Error finding container 6257276858d8d937cbf5f64838cbef3e4aec3d9a628c76c24c81328aa38cd0b5: Status 404 returned error can't find the container with id 6257276858d8d937cbf5f64838cbef3e4aec3d9a628c76c24c81328aa38cd0b5
Apr 24 23:53:23.367294 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:23.367267 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fd21580_2e57_4cb2_8470_18fa0629553c.slice/crio-6e0e17b110499dd5128260e56514ad38025cd4f9e96155a28e9ad680e0969fb4 WatchSource:0}: Error finding container 6e0e17b110499dd5128260e56514ad38025cd4f9e96155a28e9ad680e0969fb4: Status 404 returned error can't find the container with id 6e0e17b110499dd5128260e56514ad38025cd4f9e96155a28e9ad680e0969fb4
Apr 24 23:53:23.378045 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.378013 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:21 +0000 UTC" deadline="2027-11-21 18:36:53.516145874 +0000 UTC"
Apr 24 23:53:23.378045 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.378043 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13818h43m30.138106848s"
Apr 24 23:53:23.400403 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.400274 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:23.400485 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:23.400463 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada"
Apr 24 23:53:23.408382 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.408362 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-skn6p" event={"ID":"d0fe631a-be83-446b-90d8-57f1d40d01e3","Type":"ContainerStarted","Data":"c99a660d58eb335af8422d6e5351e58b95da74e744daee4f416214a45edce449"}
Apr 24 23:53:23.409465 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.409429 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" event={"ID":"f63c8907-4f05-4332-84e3-9ca9c74f643c","Type":"ContainerStarted","Data":"46c2c95113c8aaf580652937611db83eb5cc94a225db329144b0471ce6770dc3"}
Apr 24 23:53:23.411045 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.411019 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jgjtn" event={"ID":"3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2","Type":"ContainerStarted","Data":"18e38c8c19e3da465f29ae0a4f0e3691745149c061c162a8556ff9fbbfc8d946"}
Apr 24 23:53:23.413017 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.412991 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pcfrr" event={"ID":"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34","Type":"ContainerStarted","Data":"5c27e85d91bfc4d226c782b015ba6a80aeb6e418b0c3ff6f5926b0de07e958e2"}
Apr 24 23:53:23.414490 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.414466 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xfncr" event={"ID":"5de5dff9-24ac-4c52-a324-20a9923ea60b","Type":"ContainerStarted","Data":"5752fc7695246ae927d906e1f3c1cdcccf4032f6c938be1d51c6d015775c8ac3"}
Apr 24 23:53:23.415438 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.415421 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-64t8l" event={"ID":"7fd21580-2e57-4cb2-8470-18fa0629553c","Type":"ContainerStarted","Data":"6e0e17b110499dd5128260e56514ad38025cd4f9e96155a28e9ad680e0969fb4"}
Apr 24 23:53:23.416412 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.416378 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" event={"ID":"971d2d30-8f97-4832-a49b-14a3877e3eb3","Type":"ContainerStarted","Data":"6257276858d8d937cbf5f64838cbef3e4aec3d9a628c76c24c81328aa38cd0b5"}
Apr 24 23:53:23.417315 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.417281 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g4bsj" event={"ID":"f7d067fa-72fb-42f4-92b9-edee24d3ed1e","Type":"ContainerStarted","Data":"8dd75bd23c7a00462c4b76d95df789fc23f35554f681356c3bf48af4ec3a5415"}
Apr 24 23:53:23.418258 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.418235 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2cb6b" event={"ID":"2e19a4fe-f15a-4907-a972-471635868ded","Type":"ContainerStarted","Data":"84a5bbf9936beef89283134b2827dc119a3213ff349bba72adf570a2fbd39df4"}
Apr 24 23:53:23.968859 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.968824 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:23.969080 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:23.968875 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw76w\" (UniqueName: \"kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w\") pod \"network-check-target-vq8nz\" (UID: \"97ecc28e-c411-4b57-86a8-d793acbd08ad\") " pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:23.969080 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:23.969029 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:23.969080 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:23.969045 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:23.969080 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:23.969057 2566 projected.go:194] Error preparing data for projected volume kube-api-access-qw76w for pod openshift-network-diagnostics/network-check-target-vq8nz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:23.969312 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:23.969108 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w podName:97ecc28e-c411-4b57-86a8-d793acbd08ad nodeName:}" failed. No retries permitted until 2026-04-24 23:53:25.969090808 +0000 UTC m=+6.098299072 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qw76w" (UniqueName: "kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w") pod "network-check-target-vq8nz" (UID: "97ecc28e-c411-4b57-86a8-d793acbd08ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:23.969523 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:23.969505 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:23.969591 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:23.969560 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs podName:f9f062da-f1a8-4e5a-ac2f-ad672791353b nodeName:}" failed. No retries permitted until 2026-04-24 23:53:25.969544271 +0000 UTC m=+6.098752544 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs") pod "network-metrics-daemon-c6pqs" (UID: "f9f062da-f1a8-4e5a-ac2f-ad672791353b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:24.069948 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:24.069410 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:24.069948 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:24.069564 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:24.069948 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:24.069618 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret podName:4d4e321e-40d2-4107-9dbd-581cbfeb3ada nodeName:}" failed. No retries permitted until 2026-04-24 23:53:26.06960192 +0000 UTC m=+6.198810182 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret") pod "global-pull-secret-syncer-bx6k8" (UID: "4d4e321e-40d2-4107-9dbd-581cbfeb3ada") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:24.402913 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:24.402884 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:24.403308 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:24.403020 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b"
Apr 24 23:53:24.403482 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:24.403463 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:24.403770 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:24.403564 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad"
Apr 24 23:53:24.433088 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:24.433046 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-62.ec2.internal" event={"ID":"68d8ca4751462a8913290f68bba7fb20","Type":"ContainerStarted","Data":"a207fccbd22cbaa5f9e72097287ff6fbca9251227042c9dfb89500b7df359656"}
Apr 24 23:53:25.400904 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:25.400414 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:25.400904 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:25.400537 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada"
Apr 24 23:53:25.459895 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:25.459851 2566 generic.go:358] "Generic (PLEG): container finished" podID="fc283eaa290b1f4b05532791f305128c" containerID="9d1f6e41e3b973680b1478b769cdc02164c6fdd301204c33bbc9dbbc2e49b6b8" exitCode=0
Apr 24 23:53:25.460352 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:25.460331 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" event={"ID":"fc283eaa290b1f4b05532791f305128c","Type":"ContainerDied","Data":"9d1f6e41e3b973680b1478b769cdc02164c6fdd301204c33bbc9dbbc2e49b6b8"}
Apr 24 23:53:25.473820 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:25.473768 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-62.ec2.internal" podStartSLOduration=4.473748048 podStartE2EDuration="4.473748048s" podCreationTimestamp="2026-04-24 23:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:24.447609095 +0000 UTC m=+4.576817376" watchObservedRunningTime="2026-04-24 23:53:25.473748048 +0000 UTC m=+5.602956329"
Apr 24 23:53:25.984441 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:25.984349 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw76w\" (UniqueName: \"kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w\") pod \"network-check-target-vq8nz\" (UID: \"97ecc28e-c411-4b57-86a8-d793acbd08ad\") " pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:25.984607 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:25.984509 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:25.984607 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:25.984532 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:25.984607 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:25.984561 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:25.984607 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:25.984576 2566 projected.go:194] Error preparing data for projected volume kube-api-access-qw76w for pod openshift-network-diagnostics/network-check-target-vq8nz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:25.984815 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:25.984616 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:25.984815 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:25.984642 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w podName:97ecc28e-c411-4b57-86a8-d793acbd08ad nodeName:}" failed. No retries permitted until 2026-04-24 23:53:29.984624098 +0000 UTC m=+10.113832375 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qw76w" (UniqueName: "kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w") pod "network-check-target-vq8nz" (UID: "97ecc28e-c411-4b57-86a8-d793acbd08ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:25.984815 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:25.984671 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs podName:f9f062da-f1a8-4e5a-ac2f-ad672791353b nodeName:}" failed. No retries permitted until 2026-04-24 23:53:29.984653512 +0000 UTC m=+10.113861777 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs") pod "network-metrics-daemon-c6pqs" (UID: "f9f062da-f1a8-4e5a-ac2f-ad672791353b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:26.085237 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:26.085173 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:26.085407 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:26.085331 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:26.085407 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:26.085394 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret podName:4d4e321e-40d2-4107-9dbd-581cbfeb3ada nodeName:}" failed. No retries permitted until 2026-04-24 23:53:30.08537711 +0000 UTC m=+10.214585382 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret") pod "global-pull-secret-syncer-bx6k8" (UID: "4d4e321e-40d2-4107-9dbd-581cbfeb3ada") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:26.399638 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:26.399557 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:26.399801 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:26.399713 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b"
Apr 24 23:53:26.399801 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:26.399773 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:26.399903 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:26.399876 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad"
Apr 24 23:53:26.464792 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:26.464728 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" event={"ID":"fc283eaa290b1f4b05532791f305128c","Type":"ContainerStarted","Data":"d98d08ec7de85bafcfa4a846089a9dfe74ab46e10a6e99a78dea635caa358474"}
Apr 24 23:53:27.400478 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:27.400444 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:27.400665 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:27.400578 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada"
Apr 24 23:53:28.400079 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:28.400044 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:28.400527 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:28.400184 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b"
Apr 24 23:53:28.400719 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:28.400695 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:28.400888 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:28.400861 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad"
Apr 24 23:53:29.400234 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:29.400186 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:29.400708 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:29.400326 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada"
Apr 24 23:53:30.018860 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:30.018825 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:30.019041 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:30.018869 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw76w\" (UniqueName: \"kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w\") pod \"network-check-target-vq8nz\" (UID: \"97ecc28e-c411-4b57-86a8-d793acbd08ad\") " pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:30.019041 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:30.019026 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:30.019152 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:30.019043 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:53:30.019152 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:30.019061 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:53:30.019152 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:30.019075 2566 projected.go:194] Error preparing data for projected volume kube-api-access-qw76w for pod openshift-network-diagnostics/network-check-target-vq8nz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:30.019152 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:30.019093 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs podName:f9f062da-f1a8-4e5a-ac2f-ad672791353b nodeName:}" failed. No retries permitted until 2026-04-24 23:53:38.019077097 +0000 UTC m=+18.148285359 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs") pod "network-metrics-daemon-c6pqs" (UID: "f9f062da-f1a8-4e5a-ac2f-ad672791353b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:30.019152 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:30.019124 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w podName:97ecc28e-c411-4b57-86a8-d793acbd08ad nodeName:}" failed. No retries permitted until 2026-04-24 23:53:38.019108483 +0000 UTC m=+18.148316757 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qw76w" (UniqueName: "kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w") pod "network-check-target-vq8nz" (UID: "97ecc28e-c411-4b57-86a8-d793acbd08ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:53:30.120321 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:30.119723 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:30.120321 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:30.119878 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:30.120321 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:30.119937 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret podName:4d4e321e-40d2-4107-9dbd-581cbfeb3ada nodeName:}" failed. No retries permitted until 2026-04-24 23:53:38.119920296 +0000 UTC m=+18.249128557 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret") pod "global-pull-secret-syncer-bx6k8" (UID: "4d4e321e-40d2-4107-9dbd-581cbfeb3ada") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:30.400637 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:30.400147 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:30.400637 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:30.400270 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad"
Apr 24 23:53:30.400637 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:30.400586 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:30.401160 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:30.400685 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b"
Apr 24 23:53:31.400101 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:31.400065 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:31.400318 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:31.400196 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada"
Apr 24 23:53:32.400072 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:32.399820 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:32.400072 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:32.399820 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:32.400072 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:32.399970 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b"
Apr 24 23:53:32.400072 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:32.400043 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad"
Apr 24 23:53:33.399672 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:33.399640 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:33.399851 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:33.399787 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada"
Apr 24 23:53:34.399881 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:34.399844 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:34.400333 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:34.399956 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad"
Apr 24 23:53:34.400333 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:34.400025 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:34.400333 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:34.400153 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b"
Apr 24 23:53:35.399805 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:35.399768 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:35.399968 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:35.399878 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada"
Apr 24 23:53:36.399516 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:36.399478 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:36.399698 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:36.399588 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad"
Apr 24 23:53:36.399698 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:36.399651 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:36.399803 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:36.399777 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b"
Apr 24 23:53:37.400221 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:37.400127 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:37.400654 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:37.400266 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada" Apr 24 23:53:38.073788 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:38.073748 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:38.073935 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:38.073801 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw76w\" (UniqueName: \"kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w\") pod \"network-check-target-vq8nz\" (UID: \"97ecc28e-c411-4b57-86a8-d793acbd08ad\") " pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:53:38.073935 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:38.073897 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:38.073935 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:38.073928 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:38.074066 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:38.073945 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:38.074066 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:38.073957 2566 projected.go:194] Error preparing data for projected volume kube-api-access-qw76w for pod openshift-network-diagnostics/network-check-target-vq8nz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:38.074066 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:38.073977 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs podName:f9f062da-f1a8-4e5a-ac2f-ad672791353b nodeName:}" failed. No retries permitted until 2026-04-24 23:53:54.07395508 +0000 UTC m=+34.203163341 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs") pod "network-metrics-daemon-c6pqs" (UID: "f9f062da-f1a8-4e5a-ac2f-ad672791353b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:38.074066 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:38.074002 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w podName:97ecc28e-c411-4b57-86a8-d793acbd08ad nodeName:}" failed. No retries permitted until 2026-04-24 23:53:54.073987887 +0000 UTC m=+34.203196158 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qw76w" (UniqueName: "kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w") pod "network-check-target-vq8nz" (UID: "97ecc28e-c411-4b57-86a8-d793acbd08ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:38.174373 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:38.174342 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:38.174544 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:38.174460 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:38.174544 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:38.174526 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret podName:4d4e321e-40d2-4107-9dbd-581cbfeb3ada nodeName:}" failed. No retries permitted until 2026-04-24 23:53:54.174508362 +0000 UTC m=+34.303716639 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret") pod "global-pull-secret-syncer-bx6k8" (UID: "4d4e321e-40d2-4107-9dbd-581cbfeb3ada") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:38.399802 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:38.399730 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:53:38.399936 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:38.399856 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad" Apr 24 23:53:38.399936 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:38.399915 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:38.400053 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:38.400032 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b" Apr 24 23:53:39.399668 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:39.399629 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:39.400088 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:39.399756 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada" Apr 24 23:53:40.400737 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.400532 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:40.401337 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.400635 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:53:40.401337 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:40.400842 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b" Apr 24 23:53:40.401337 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:40.400913 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad" Apr 24 23:53:40.488489 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.488289 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" event={"ID":"971d2d30-8f97-4832-a49b-14a3877e3eb3","Type":"ContainerStarted","Data":"4aca1c96187b14d2bc6f597a8993c9888e2b928db211b2b516df62f4398fe6b6"} Apr 24 23:53:40.489568 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.489537 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g4bsj" event={"ID":"f7d067fa-72fb-42f4-92b9-edee24d3ed1e","Type":"ContainerStarted","Data":"33b3d4a23ce87bb62cc01e3357eec49ffcb2388ac66f7eba52c86963df46c879"} Apr 24 23:53:40.490633 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.490610 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-skn6p" event={"ID":"d0fe631a-be83-446b-90d8-57f1d40d01e3","Type":"ContainerStarted","Data":"59c87df97cd80d5177061a6d74177e0e886ae9343c5c827574de97e1201ef193"} Apr 24 23:53:40.492052 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.492034 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log" Apr 24 23:53:40.492321 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.492302 2566 generic.go:358] "Generic (PLEG): container finished" podID="f63c8907-4f05-4332-84e3-9ca9c74f643c" containerID="d14062444a2e3935dff05ea08920b406a350c33cd9480977a0a6c4d02b90683a" exitCode=1 Apr 24 23:53:40.492396 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.492364 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" event={"ID":"f63c8907-4f05-4332-84e3-9ca9c74f643c","Type":"ContainerStarted","Data":"ba3f90d4795786aa84dccac1248ddf1707de3a5e1cf58a601c930ba31d38a62d"} Apr 24 
23:53:40.492396 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.492387 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" event={"ID":"f63c8907-4f05-4332-84e3-9ca9c74f643c","Type":"ContainerDied","Data":"d14062444a2e3935dff05ea08920b406a350c33cd9480977a0a6c4d02b90683a"} Apr 24 23:53:40.492483 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.492399 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" event={"ID":"f63c8907-4f05-4332-84e3-9ca9c74f643c","Type":"ContainerStarted","Data":"0d1fd323f2494f97f59535a334aabda5545bd99d37b53cbba860db946ab5e55a"} Apr 24 23:53:40.493447 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.493426 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jgjtn" event={"ID":"3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2","Type":"ContainerStarted","Data":"c92776f7acb1a630a6ec9eb9364be2b6cb1a80fd2f2e116de6c967fa11ed921b"} Apr 24 23:53:40.494667 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.494645 2566 generic.go:358] "Generic (PLEG): container finished" podID="2aa55bd4-e281-4226-85cb-d9aa2ce0bd34" containerID="e46b6a78cd99d54adbbd4c84fd5f0499825efad1f36e22c10cbfb0400f6bd2a8" exitCode=0 Apr 24 23:53:40.494750 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.494717 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pcfrr" event={"ID":"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34","Type":"ContainerDied","Data":"e46b6a78cd99d54adbbd4c84fd5f0499825efad1f36e22c10cbfb0400f6bd2a8"} Apr 24 23:53:40.495897 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.495864 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xfncr" event={"ID":"5de5dff9-24ac-4c52-a324-20a9923ea60b","Type":"ContainerStarted","Data":"77d7571324bbfedfcf2cd0bf6130f3e040b0da624fa5214548525ee35b5f8279"} Apr 24 23:53:40.496924 ip-10-0-139-62 
kubenswrapper[2566]: I0424 23:53:40.496903 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-64t8l" event={"ID":"7fd21580-2e57-4cb2-8470-18fa0629553c","Type":"ContainerStarted","Data":"b31c22506ed46e107879db6b1b3f680630195ace27f0c3f7c2317a295eae50ed"} Apr 24 23:53:40.505552 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.505522 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-62.ec2.internal" podStartSLOduration=19.505511895 podStartE2EDuration="19.505511895s" podCreationTimestamp="2026-04-24 23:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:26.47714088 +0000 UTC m=+6.606349159" watchObservedRunningTime="2026-04-24 23:53:40.505511895 +0000 UTC m=+20.634720174" Apr 24 23:53:40.505968 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.505943 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-g4bsj" podStartSLOduration=3.90842788 podStartE2EDuration="20.505935709s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="2026-04-24 23:53:23.355306212 +0000 UTC m=+3.484514478" lastFinishedPulling="2026-04-24 23:53:39.952814036 +0000 UTC m=+20.082022307" observedRunningTime="2026-04-24 23:53:40.50537685 +0000 UTC m=+20.634585130" watchObservedRunningTime="2026-04-24 23:53:40.505935709 +0000 UTC m=+20.635143987" Apr 24 23:53:40.517027 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.516992 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jgjtn" podStartSLOduration=3.9552904719999997 podStartE2EDuration="20.516982996s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="2026-04-24 23:53:23.360187289 +0000 UTC m=+3.489395549" lastFinishedPulling="2026-04-24 23:53:39.921879802 
+0000 UTC m=+20.051088073" observedRunningTime="2026-04-24 23:53:40.516913258 +0000 UTC m=+20.646121536" watchObservedRunningTime="2026-04-24 23:53:40.516982996 +0000 UTC m=+20.646191273" Apr 24 23:53:40.544075 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.544034 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-skn6p" podStartSLOduration=3.987600533 podStartE2EDuration="20.544019595s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="2026-04-24 23:53:23.367031482 +0000 UTC m=+3.496239745" lastFinishedPulling="2026-04-24 23:53:39.923450537 +0000 UTC m=+20.052658807" observedRunningTime="2026-04-24 23:53:40.543856111 +0000 UTC m=+20.673064389" watchObservedRunningTime="2026-04-24 23:53:40.544019595 +0000 UTC m=+20.673227873" Apr 24 23:53:40.570449 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.570334 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-64t8l" podStartSLOduration=4.017495545 podStartE2EDuration="20.570319348s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="2026-04-24 23:53:23.369711416 +0000 UTC m=+3.498919673" lastFinishedPulling="2026-04-24 23:53:39.922535206 +0000 UTC m=+20.051743476" observedRunningTime="2026-04-24 23:53:40.555906939 +0000 UTC m=+20.685115219" watchObservedRunningTime="2026-04-24 23:53:40.570319348 +0000 UTC m=+20.699527608" Apr 24 23:53:40.571188 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:40.571147 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xfncr" podStartSLOduration=11.890236294 podStartE2EDuration="20.571132519s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="2026-04-24 23:53:23.356943835 +0000 UTC m=+3.486152108" lastFinishedPulling="2026-04-24 23:53:32.037840076 +0000 UTC m=+12.167048333" observedRunningTime="2026-04-24 23:53:40.570079798 
+0000 UTC m=+20.699288074" watchObservedRunningTime="2026-04-24 23:53:40.571132519 +0000 UTC m=+20.700340799" Apr 24 23:53:41.421231 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:41.421141 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:41.421612 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:41.421257 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada" Apr 24 23:53:41.500565 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:41.500537 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log" Apr 24 23:53:41.500896 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:41.500873 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" event={"ID":"f63c8907-4f05-4332-84e3-9ca9c74f643c","Type":"ContainerStarted","Data":"ab494ad25d186113f3516e24505a14c13d6a01890ca33fd3fa7a404c7aaf0258"} Apr 24 23:53:41.500968 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:41.500908 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" event={"ID":"f63c8907-4f05-4332-84e3-9ca9c74f643c","Type":"ContainerStarted","Data":"0e688798dee8473e7df3d6dce2abbb9ad8341e38987efa6e8ddbd92afc897831"} Apr 24 23:53:41.500968 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:41.500919 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" 
event={"ID":"f63c8907-4f05-4332-84e3-9ca9c74f643c","Type":"ContainerStarted","Data":"08ee2e6e955c168a189d56146faf3b00e0993c74b3cd460d3f13b6489090c887"} Apr 24 23:53:41.502083 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:41.502049 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2cb6b" event={"ID":"2e19a4fe-f15a-4907-a972-471635868ded","Type":"ContainerStarted","Data":"e7b8e8bcc817f77bab2971c600b067836825bd822c1882d20cb80b67cc099417"} Apr 24 23:53:41.517011 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:41.516969 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2cb6b" podStartSLOduration=4.95846812 podStartE2EDuration="21.516953457s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="2026-04-24 23:53:23.353187684 +0000 UTC m=+3.482395941" lastFinishedPulling="2026-04-24 23:53:39.911673016 +0000 UTC m=+20.040881278" observedRunningTime="2026-04-24 23:53:41.516235525 +0000 UTC m=+21.645443877" watchObservedRunningTime="2026-04-24 23:53:41.516953457 +0000 UTC m=+21.646161738" Apr 24 23:53:41.734488 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:41.734456 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 23:53:42.399883 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:42.399829 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:42.400080 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:42.399893 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:53:42.400080 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:42.400004 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b" Apr 24 23:53:42.400184 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:42.400084 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad" Apr 24 23:53:42.409776 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:42.409679 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T23:53:41.734481888Z","UUID":"5dfed272-cc49-413d-a9aa-f629c7afdc50","Handler":null,"Name":"","Endpoint":""} Apr 24 23:53:42.412624 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:42.412602 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 23:53:42.412624 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:42.412629 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 23:53:42.505988 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:42.505956 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" event={"ID":"971d2d30-8f97-4832-a49b-14a3877e3eb3","Type":"ContainerStarted","Data":"0ca20c27ee7e85b0a3d8c48488b075ce1a46cae1b9f46be3032be3c52c6eabcb"} Apr 24 23:53:43.045916 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:43.045670 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xfncr" Apr 24 23:53:43.046411 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:43.046392 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xfncr" Apr 24 23:53:43.400552 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:43.400459 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:43.400693 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:43.400578 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada" Apr 24 23:53:43.511602 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:43.511572 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log" Apr 24 23:53:43.512014 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:43.511975 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" event={"ID":"f63c8907-4f05-4332-84e3-9ca9c74f643c","Type":"ContainerStarted","Data":"4f3efe8174e465581a2ee06d24e567660899d7f3134f7b24089dedebf87598ab"} Apr 24 23:53:43.513846 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:43.513810 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" event={"ID":"971d2d30-8f97-4832-a49b-14a3877e3eb3","Type":"ContainerStarted","Data":"4d0f5034cfb823f76b73d7469d60f636051804b4b9b55c34e75cc2a9a4111aa7"} Apr 24 23:53:43.514558 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:43.514092 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xfncr" Apr 24 23:53:43.514682 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:43.514582 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xfncr" Apr 24 23:53:43.532170 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:43.532114 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jwk57" podStartSLOduration=3.779632985 podStartE2EDuration="23.5320976s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="2026-04-24 23:53:23.364847474 +0000 UTC m=+3.494055745" lastFinishedPulling="2026-04-24 23:53:43.1173121 +0000 UTC m=+23.246520360" observedRunningTime="2026-04-24 23:53:43.531615118 +0000 UTC 
m=+23.660823398" watchObservedRunningTime="2026-04-24 23:53:43.5320976 +0000 UTC m=+23.661305881" Apr 24 23:53:44.399515 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:44.399481 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:44.399515 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:44.399509 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:53:44.399766 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:44.399606 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b" Apr 24 23:53:44.399766 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:44.399720 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad" Apr 24 23:53:45.400171 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:45.400149 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:45.400789 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:45.400277 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada" Apr 24 23:53:45.520713 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:45.520578 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log" Apr 24 23:53:45.521055 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:45.521032 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" event={"ID":"f63c8907-4f05-4332-84e3-9ca9c74f643c","Type":"ContainerStarted","Data":"5f4d87c8bad27b599b88cab58c16af2c6bf787f3997869d560b430fdf0a2d788"} Apr 24 23:53:45.521429 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:45.521408 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:45.521564 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:45.521438 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:45.521564 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:45.521450 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:45.521654 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:45.521572 2566 scope.go:117] "RemoveContainer" containerID="d14062444a2e3935dff05ea08920b406a350c33cd9480977a0a6c4d02b90683a" Apr 24 23:53:45.523262 ip-10-0-139-62 
kubenswrapper[2566]: I0424 23:53:45.523146 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pcfrr" event={"ID":"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34","Type":"ContainerStarted","Data":"48fd271b5f370b0c0f0f5912a2195713397097296ae0db9c42f519f8a152635f"} Apr 24 23:53:45.540615 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:45.540559 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:45.540989 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:45.540968 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" Apr 24 23:53:46.400144 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:46.400113 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:46.400317 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:46.400154 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:53:46.400317 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:46.400262 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b" Apr 24 23:53:46.400713 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:46.400370 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad" Apr 24 23:53:46.526468 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:46.526438 2566 generic.go:358] "Generic (PLEG): container finished" podID="2aa55bd4-e281-4226-85cb-d9aa2ce0bd34" containerID="48fd271b5f370b0c0f0f5912a2195713397097296ae0db9c42f519f8a152635f" exitCode=0 Apr 24 23:53:46.526621 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:46.526528 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pcfrr" event={"ID":"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34","Type":"ContainerDied","Data":"48fd271b5f370b0c0f0f5912a2195713397097296ae0db9c42f519f8a152635f"} Apr 24 23:53:46.529692 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:46.529673 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log" Apr 24 23:53:46.529999 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:46.529978 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" event={"ID":"f63c8907-4f05-4332-84e3-9ca9c74f643c","Type":"ContainerStarted","Data":"f27f59a7af95634101e73dbb0c3bf144a1303ad56ad5286bd3939f7bac25fb2e"} Apr 24 23:53:46.577920 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:46.577865 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v" podStartSLOduration=9.94316566 podStartE2EDuration="26.577847865s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="2026-04-24 23:53:23.364102887 +0000 UTC m=+3.493311159" lastFinishedPulling="2026-04-24 23:53:39.998785091 +0000 UTC m=+20.127993364" observedRunningTime="2026-04-24 23:53:46.576615788 +0000 UTC m=+26.705824091" watchObservedRunningTime="2026-04-24 23:53:46.577847865 +0000 UTC m=+26.707056146" Apr 24 
23:53:47.224527 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:47.224492 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c6pqs"] Apr 24 23:53:47.224725 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:47.224617 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:47.224772 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:47.224743 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b" Apr 24 23:53:47.226709 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:47.226681 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bx6k8"] Apr 24 23:53:47.226831 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:47.226799 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:47.226923 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:47.226901 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada" Apr 24 23:53:47.233107 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:47.233069 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vq8nz"] Apr 24 23:53:47.233295 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:47.233224 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:53:47.233546 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:47.233507 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad" Apr 24 23:53:47.533014 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:47.532949 2566 generic.go:358] "Generic (PLEG): container finished" podID="2aa55bd4-e281-4226-85cb-d9aa2ce0bd34" containerID="180d46f99f7d186a264fb83b9a3b5bd49895056e5c97d38644993b8fcf753121" exitCode=0 Apr 24 23:53:47.533359 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:47.533040 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pcfrr" event={"ID":"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34","Type":"ContainerDied","Data":"180d46f99f7d186a264fb83b9a3b5bd49895056e5c97d38644993b8fcf753121"} Apr 24 23:53:48.399549 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:48.399523 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:48.399684 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:48.399538 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:48.399730 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:48.399673 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b" Apr 24 23:53:48.399730 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:48.399715 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada" Apr 24 23:53:48.536620 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:48.536585 2566 generic.go:358] "Generic (PLEG): container finished" podID="2aa55bd4-e281-4226-85cb-d9aa2ce0bd34" containerID="f74fcf3e41a71ff67755f0611e5a55207ed7a34e6be228c7559131f97901df36" exitCode=0 Apr 24 23:53:48.536931 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:48.536626 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pcfrr" event={"ID":"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34","Type":"ContainerDied","Data":"f74fcf3e41a71ff67755f0611e5a55207ed7a34e6be228c7559131f97901df36"} Apr 24 23:53:49.400002 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:49.399967 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:53:49.400137 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:49.400105 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad" Apr 24 23:53:50.401015 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:50.400984 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:50.401637 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:50.401093 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b" Apr 24 23:53:50.401637 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:50.401132 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:50.401637 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:50.401261 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada" Apr 24 23:53:51.400469 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:51.400393 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:53:51.400606 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:51.400492 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vq8nz" podUID="97ecc28e-c411-4b57-86a8-d793acbd08ad" Apr 24 23:53:52.399809 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.399766 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:53:52.400315 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:52.399904 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b" Apr 24 23:53:52.400315 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.399976 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:53:52.400315 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:52.400062 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bx6k8" podUID="4d4e321e-40d2-4107-9dbd-581cbfeb3ada" Apr 24 23:53:52.715726 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.715651 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-62.ec2.internal" event="NodeReady" Apr 24 23:53:52.715884 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.715818 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 23:53:52.748587 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.748561 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69dc66854d-xqpjn"] Apr 24 23:53:52.751428 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.751414 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.753938 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.753917 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 23:53:52.754311 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.754290 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 23:53:52.754387 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.754347 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 23:53:52.754438 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.754347 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-f8xnt\"" Apr 24 23:53:52.760415 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.760398 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 23:53:52.762498 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.762481 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69dc66854d-xqpjn"] Apr 24 23:53:52.765069 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.765040 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-75ql5"] Apr 24 23:53:52.768242 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.768222 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b956z"] Apr 24 23:53:52.768339 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.768321 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-75ql5" Apr 24 23:53:52.770580 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.770556 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 23:53:52.770819 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.770804 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 23:53:52.770900 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.770823 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ktrpk\"" Apr 24 23:53:52.771154 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.771139 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b956z" Apr 24 23:53:52.773151 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.773133 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 23:53:52.773452 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.773437 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 23:53:52.773659 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.773645 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 23:53:52.775979 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.775958 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-75ql5"] Apr 24 23:53:52.776401 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.776385 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-h7snm\"" Apr 24 23:53:52.788274 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.788255 
2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b956z"] Apr 24 23:53:52.889533 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889495 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng9qh\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-kube-api-access-ng9qh\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.889533 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889539 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58vn9\" (UniqueName: \"kubernetes.io/projected/47c85934-321e-42e7-9abd-19c5dc8818e0-kube-api-access-58vn9\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z" Apr 24 23:53:52.889712 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889617 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c180154-dc8b-44dc-a86b-9564a07e09c5-config-volume\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5" Apr 24 23:53:52.889712 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889665 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-ca-trust-extracted\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.889712 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889682 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z" Apr 24 23:53:52.889712 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889701 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-bound-sa-token\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.889847 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889725 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.889847 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889744 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-trusted-ca\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.889847 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889793 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-image-registry-private-configuration\") pod \"image-registry-69dc66854d-xqpjn\" (UID: 
\"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.889847 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889813 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5" Apr 24 23:53:52.889847 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889833 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-installation-pull-secrets\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.889976 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889861 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c180154-dc8b-44dc-a86b-9564a07e09c5-tmp-dir\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5" Apr 24 23:53:52.889976 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889916 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlmtm\" (UniqueName: \"kubernetes.io/projected/6c180154-dc8b-44dc-a86b-9564a07e09c5-kube-api-access-dlmtm\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5" Apr 24 23:53:52.889976 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.889949 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-certificates\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.990852 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.990776 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c180154-dc8b-44dc-a86b-9564a07e09c5-tmp-dir\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5" Apr 24 23:53:52.990852 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.990829 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlmtm\" (UniqueName: \"kubernetes.io/projected/6c180154-dc8b-44dc-a86b-9564a07e09c5-kube-api-access-dlmtm\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5" Apr 24 23:53:52.991049 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.990856 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-certificates\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.991049 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.990875 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng9qh\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-kube-api-access-ng9qh\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.991049 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.990890 
2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58vn9\" (UniqueName: \"kubernetes.io/projected/47c85934-321e-42e7-9abd-19c5dc8818e0-kube-api-access-58vn9\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z" Apr 24 23:53:52.991049 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.990927 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c180154-dc8b-44dc-a86b-9564a07e09c5-config-volume\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5" Apr 24 23:53:52.991049 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.991037 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-ca-trust-extracted\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.991297 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.991064 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z" Apr 24 23:53:52.991297 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.991096 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-bound-sa-token\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.991297 
ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.991119 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6c180154-dc8b-44dc-a86b-9564a07e09c5-tmp-dir\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5" Apr 24 23:53:52.991297 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.991138 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.991297 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.991179 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-trusted-ca\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.991297 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:52.991191 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:53:52.991297 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.991235 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-image-registry-private-configuration\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.991297 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.991266 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5" Apr 24 23:53:52.991297 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:52.991292 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert podName:47c85934-321e-42e7-9abd-19c5dc8818e0 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:53.491269124 +0000 UTC m=+33.620477397 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert") pod "ingress-canary-b956z" (UID: "47c85934-321e-42e7-9abd-19c5dc8818e0") : secret "canary-serving-cert" not found Apr 24 23:53:52.991725 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.991336 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-installation-pull-secrets\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.991725 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.991402 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-ca-trust-extracted\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:53:52.991725 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:52.991475 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 
23:53:52.991725 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:52.991488 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69dc66854d-xqpjn: secret "image-registry-tls" not found
Apr 24 23:53:52.991725 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:52.991539 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls podName:a185de46-a1c8-4c54-b1dc-d4928ab48ce4 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:53.491522917 +0000 UTC m=+33.620731174 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls") pod "image-registry-69dc66854d-xqpjn" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4") : secret "image-registry-tls" not found
Apr 24 23:53:52.991725 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:52.991590 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:53:52.991725 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:52.991655 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls podName:6c180154-dc8b-44dc-a86b-9564a07e09c5 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:53.491636793 +0000 UTC m=+33.620845050 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls") pod "dns-default-75ql5" (UID: "6c180154-dc8b-44dc-a86b-9564a07e09c5") : secret "dns-default-metrics-tls" not found
Apr 24 23:53:52.991725 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.991656 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c180154-dc8b-44dc-a86b-9564a07e09c5-config-volume\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5"
Apr 24 23:53:52.991725 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.991670 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-certificates\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:53:52.992081 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.992062 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-trusted-ca\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:53:52.995083 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.995060 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-installation-pull-secrets\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:53:52.995182 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:52.995068 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-image-registry-private-configuration\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:53:53.006226 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:53.006167 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng9qh\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-kube-api-access-ng9qh\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:53:53.006462 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:53.006435 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58vn9\" (UniqueName: \"kubernetes.io/projected/47c85934-321e-42e7-9abd-19c5dc8818e0-kube-api-access-58vn9\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z"
Apr 24 23:53:53.006939 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:53.006919 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-bound-sa-token\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:53:53.007388 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:53.007371 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlmtm\" (UniqueName: \"kubernetes.io/projected/6c180154-dc8b-44dc-a86b-9564a07e09c5-kube-api-access-dlmtm\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5"
Apr 24 23:53:53.400131 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:53.400044 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:53.403327 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:53.403136 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 23:53:53.403327 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:53.403200 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 23:53:53.403327 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:53.403287 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wfsx7\""
Apr 24 23:53:53.495946 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:53.495913 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:53:53.496115 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:53.495965 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5"
Apr 24 23:53:53.496115 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:53.496059 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z"
Apr 24 23:53:53.496115 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:53.496089 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:53:53.496115 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:53.496109 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69dc66854d-xqpjn: secret "image-registry-tls" not found
Apr 24 23:53:53.496319 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:53.496137 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:53:53.496319 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:53.496159 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:53:53.496319 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:53.496176 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls podName:a185de46-a1c8-4c54-b1dc-d4928ab48ce4 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:54.496161405 +0000 UTC m=+34.625369667 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls") pod "image-registry-69dc66854d-xqpjn" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4") : secret "image-registry-tls" not found
Apr 24 23:53:53.496319 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:53.496223 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls podName:6c180154-dc8b-44dc-a86b-9564a07e09c5 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:54.496185886 +0000 UTC m=+34.625394160 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls") pod "dns-default-75ql5" (UID: "6c180154-dc8b-44dc-a86b-9564a07e09c5") : secret "dns-default-metrics-tls" not found
Apr 24 23:53:53.496319 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:53.496246 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert podName:47c85934-321e-42e7-9abd-19c5dc8818e0 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:54.496236472 +0000 UTC m=+34.625444736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert") pod "ingress-canary-b956z" (UID: "47c85934-321e-42e7-9abd-19c5dc8818e0") : secret "canary-serving-cert" not found
Apr 24 23:53:54.100922 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.100879 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:54.101101 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.100939 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw76w\" (UniqueName: \"kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w\") pod \"network-check-target-vq8nz\" (UID: \"97ecc28e-c411-4b57-86a8-d793acbd08ad\") " pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:54.101101 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:54.101049 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:54.101173 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:54.101115 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs podName:f9f062da-f1a8-4e5a-ac2f-ad672791353b nodeName:}" failed. No retries permitted until 2026-04-24 23:54:26.101097566 +0000 UTC m=+66.230305842 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs") pod "network-metrics-daemon-c6pqs" (UID: "f9f062da-f1a8-4e5a-ac2f-ad672791353b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:53:54.103978 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.103953 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw76w\" (UniqueName: \"kubernetes.io/projected/97ecc28e-c411-4b57-86a8-d793acbd08ad-kube-api-access-qw76w\") pod \"network-check-target-vq8nz\" (UID: \"97ecc28e-c411-4b57-86a8-d793acbd08ad\") " pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:54.202362 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.202326 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:54.202542 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:54.202480 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:54.202600 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:54.202550 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret podName:4d4e321e-40d2-4107-9dbd-581cbfeb3ada nodeName:}" failed. No retries permitted until 2026-04-24 23:54:26.202531827 +0000 UTC m=+66.331740097 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret") pod "global-pull-secret-syncer-bx6k8" (UID: "4d4e321e-40d2-4107-9dbd-581cbfeb3ada") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:53:54.310785 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.310744 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:54.400071 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.399988 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8"
Apr 24 23:53:54.400249 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.399989 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:53:54.402835 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.402797 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 23:53:54.402835 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.402799 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jk9j5\""
Apr 24 23:53:54.403859 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.403838 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 23:53:54.504064 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.504025 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:53:54.504269 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.504088 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5"
Apr 24 23:53:54.504269 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:54.504150 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:53:54.504269 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:54.504174 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69dc66854d-xqpjn: secret "image-registry-tls" not found
Apr 24 23:53:54.504269 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:54.504188 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:53:54.504269 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:54.504195 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z"
Apr 24 23:53:54.504269 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:54.504248 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls podName:a185de46-a1c8-4c54-b1dc-d4928ab48ce4 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:56.504229732 +0000 UTC m=+36.633438013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls") pod "image-registry-69dc66854d-xqpjn" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4") : secret "image-registry-tls" not found
Apr 24 23:53:54.504507 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:54.504306 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:53:54.504507 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:54.504382 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls podName:6c180154-dc8b-44dc-a86b-9564a07e09c5 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:56.504367049 +0000 UTC m=+36.633575310 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls") pod "dns-default-75ql5" (UID: "6c180154-dc8b-44dc-a86b-9564a07e09c5") : secret "dns-default-metrics-tls" not found
Apr 24 23:53:54.504507 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:54.504402 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert podName:47c85934-321e-42e7-9abd-19c5dc8818e0 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:56.504391711 +0000 UTC m=+36.633599975 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert") pod "ingress-canary-b956z" (UID: "47c85934-321e-42e7-9abd-19c5dc8818e0") : secret "canary-serving-cert" not found
Apr 24 23:53:55.240024 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:55.239804 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vq8nz"]
Apr 24 23:53:55.334399 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:53:55.334291 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97ecc28e_c411_4b57_86a8_d793acbd08ad.slice/crio-b6841e8fdc82a3a3f984e93cd2348acdb4241ea2d9d54eb9ee686d2cf17a0cc6 WatchSource:0}: Error finding container b6841e8fdc82a3a3f984e93cd2348acdb4241ea2d9d54eb9ee686d2cf17a0cc6: Status 404 returned error can't find the container with id b6841e8fdc82a3a3f984e93cd2348acdb4241ea2d9d54eb9ee686d2cf17a0cc6
Apr 24 23:53:55.550845 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:55.550808 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vq8nz" event={"ID":"97ecc28e-c411-4b57-86a8-d793acbd08ad","Type":"ContainerStarted","Data":"b6841e8fdc82a3a3f984e93cd2348acdb4241ea2d9d54eb9ee686d2cf17a0cc6"}
Apr 24 23:53:55.553466 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:55.553437 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pcfrr" event={"ID":"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34","Type":"ContainerStarted","Data":"ed0d8a651663a54cf1acf4131e378c49547efc0e2aaa0ec0f9b09553d6f248fe"}
Apr 24 23:53:56.518284 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:56.518252 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z"
Apr 24 23:53:56.518452 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:56.518295 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:53:56.518452 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:56.518319 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5"
Apr 24 23:53:56.518452 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:56.518404 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:53:56.518452 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:56.518415 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:53:56.518452 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:56.518430 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:53:56.518452 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:56.518432 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69dc66854d-xqpjn: secret "image-registry-tls" not found
Apr 24 23:53:56.518641 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:56.518465 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert podName:47c85934-321e-42e7-9abd-19c5dc8818e0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:00.518451451 +0000 UTC m=+40.647659708 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert") pod "ingress-canary-b956z" (UID: "47c85934-321e-42e7-9abd-19c5dc8818e0") : secret "canary-serving-cert" not found
Apr 24 23:53:56.518641 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:56.518480 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls podName:6c180154-dc8b-44dc-a86b-9564a07e09c5 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:00.518474208 +0000 UTC m=+40.647682465 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls") pod "dns-default-75ql5" (UID: "6c180154-dc8b-44dc-a86b-9564a07e09c5") : secret "dns-default-metrics-tls" not found
Apr 24 23:53:56.518641 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:53:56.518490 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls podName:a185de46-a1c8-4c54-b1dc-d4928ab48ce4 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:00.51848537 +0000 UTC m=+40.647693628 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls") pod "image-registry-69dc66854d-xqpjn" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4") : secret "image-registry-tls" not found
Apr 24 23:53:56.557476 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:56.557448 2566 generic.go:358] "Generic (PLEG): container finished" podID="2aa55bd4-e281-4226-85cb-d9aa2ce0bd34" containerID="ed0d8a651663a54cf1acf4131e378c49547efc0e2aaa0ec0f9b09553d6f248fe" exitCode=0
Apr 24 23:53:56.557779 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:56.557492 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pcfrr" event={"ID":"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34","Type":"ContainerDied","Data":"ed0d8a651663a54cf1acf4131e378c49547efc0e2aaa0ec0f9b09553d6f248fe"}
Apr 24 23:53:57.562582 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:57.562547 2566 generic.go:358] "Generic (PLEG): container finished" podID="2aa55bd4-e281-4226-85cb-d9aa2ce0bd34" containerID="68e5ad08669cbeaba48e19f2eab763e9c98543f2e62ffdcb079fb11944088416" exitCode=0
Apr 24 23:53:57.563126 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:57.562628 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pcfrr" event={"ID":"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34","Type":"ContainerDied","Data":"68e5ad08669cbeaba48e19f2eab763e9c98543f2e62ffdcb079fb11944088416"}
Apr 24 23:53:59.570072 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:59.570034 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pcfrr" event={"ID":"2aa55bd4-e281-4226-85cb-d9aa2ce0bd34","Type":"ContainerStarted","Data":"cf1a307d7a6513c195e3fc6f28915bb032affbf49c0ae5fc370e85afebce967a"}
Apr 24 23:53:59.571272 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:59.571251 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vq8nz" event={"ID":"97ecc28e-c411-4b57-86a8-d793acbd08ad","Type":"ContainerStarted","Data":"6e0cce0118ae93b1337489dc341b86963d5bf691ce743b5747f5887fee354dd5"}
Apr 24 23:53:59.571428 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:59.571414 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vq8nz"
Apr 24 23:53:59.609967 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:59.609918 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pcfrr" podStartSLOduration=7.606817392 podStartE2EDuration="39.609904343s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="2026-04-24 23:53:23.357352406 +0000 UTC m=+3.486560666" lastFinishedPulling="2026-04-24 23:53:55.360439345 +0000 UTC m=+35.489647617" observedRunningTime="2026-04-24 23:53:59.608336888 +0000 UTC m=+39.737545192" watchObservedRunningTime="2026-04-24 23:53:59.609904343 +0000 UTC m=+39.739112622"
Apr 24 23:53:59.631689 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:53:59.631643 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vq8nz" podStartSLOduration=36.013245485 podStartE2EDuration="39.631628186s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="2026-04-24 23:53:55.339108131 +0000 UTC m=+35.468316389" lastFinishedPulling="2026-04-24 23:53:58.957490819 +0000 UTC m=+39.086699090" observedRunningTime="2026-04-24 23:53:59.630316669 +0000 UTC m=+39.759524947" watchObservedRunningTime="2026-04-24 23:53:59.631628186 +0000 UTC m=+39.760836461"
Apr 24 23:54:00.549221 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:00.549155 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z"
Apr 24 23:54:00.549395 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:00.549246 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:54:00.549395 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:00.549277 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5"
Apr 24 23:54:00.549395 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:00.549308 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:00.549395 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:00.549369 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:00.549518 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:00.549389 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:54:00.549518 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:00.549411 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69dc66854d-xqpjn: secret "image-registry-tls" not found
Apr 24 23:54:00.549518 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:00.549371 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert podName:47c85934-321e-42e7-9abd-19c5dc8818e0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:08.549352375 +0000 UTC m=+48.678560646 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert") pod "ingress-canary-b956z" (UID: "47c85934-321e-42e7-9abd-19c5dc8818e0") : secret "canary-serving-cert" not found
Apr 24 23:54:00.549518 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:00.549477 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls podName:6c180154-dc8b-44dc-a86b-9564a07e09c5 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:08.54945576 +0000 UTC m=+48.678664022 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls") pod "dns-default-75ql5" (UID: "6c180154-dc8b-44dc-a86b-9564a07e09c5") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:00.549518 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:00.549495 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls podName:a185de46-a1c8-4c54-b1dc-d4928ab48ce4 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:08.54948863 +0000 UTC m=+48.678696887 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls") pod "image-registry-69dc66854d-xqpjn" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4") : secret "image-registry-tls" not found
Apr 24 23:54:08.606027 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:08.605993 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:54:08.606027 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:08.606037 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5"
Apr 24 23:54:08.606570 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:08.606155 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 23:54:08.606570 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:08.606179 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69dc66854d-xqpjn: secret "image-registry-tls" not found
Apr 24 23:54:08.606570 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:08.606225 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:08.606570 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:08.606246 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls podName:a185de46-a1c8-4c54-b1dc-d4928ab48ce4 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:24.606231512 +0000 UTC m=+64.735439769 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls") pod "image-registry-69dc66854d-xqpjn" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4") : secret "image-registry-tls" not found
Apr 24 23:54:08.606570 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:08.606263 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls podName:6c180154-dc8b-44dc-a86b-9564a07e09c5 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:24.606252428 +0000 UTC m=+64.735460685 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls") pod "dns-default-75ql5" (UID: "6c180154-dc8b-44dc-a86b-9564a07e09c5") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:08.606570 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:08.606357 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z"
Apr 24 23:54:08.606570 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:08.606487 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:08.606570 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:08.606528 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert podName:47c85934-321e-42e7-9abd-19c5dc8818e0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:24.606518225 +0000 UTC m=+64.735726482 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert") pod "ingress-canary-b956z" (UID: "47c85934-321e-42e7-9abd-19c5dc8818e0") : secret "canary-serving-cert" not found
Apr 24 23:54:17.549518 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:17.549487 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tfv9v"
Apr 24 23:54:23.578926 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.578891 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d"]
Apr 24 23:54:23.585657 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.585632 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d"
Apr 24 23:54:23.588143 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.588121 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 23:54:23.588274 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.588151 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 23:54:23.588274 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.588150 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 23:54:23.589128 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.589113 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 23:54:23.592658 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.592640 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d"]
Apr 24 23:54:23.619991 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.619963 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bcf91163-5540-4da4-a8d2-6e769ef3cc47-klusterlet-config\") pod \"klusterlet-addon-workmgr-fc7879598-jxq2d\" (UID: \"bcf91163-5540-4da4-a8d2-6e769ef3cc47\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d"
Apr 24 23:54:23.620128 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.619998 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bcf91163-5540-4da4-a8d2-6e769ef3cc47-tmp\") pod \"klusterlet-addon-workmgr-fc7879598-jxq2d\" (UID: \"bcf91163-5540-4da4-a8d2-6e769ef3cc47\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d"
Apr 24 23:54:23.620191 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.620120 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsjzk\" (UniqueName: \"kubernetes.io/projected/bcf91163-5540-4da4-a8d2-6e769ef3cc47-kube-api-access-vsjzk\") pod \"klusterlet-addon-workmgr-fc7879598-jxq2d\" (UID: \"bcf91163-5540-4da4-a8d2-6e769ef3cc47\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d"
Apr 24 23:54:23.720719 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.720676 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bcf91163-5540-4da4-a8d2-6e769ef3cc47-klusterlet-config\") pod \"klusterlet-addon-workmgr-fc7879598-jxq2d\" (UID: \"bcf91163-5540-4da4-a8d2-6e769ef3cc47\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d" Apr 24 23:54:23.720719 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.720721 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bcf91163-5540-4da4-a8d2-6e769ef3cc47-tmp\") pod \"klusterlet-addon-workmgr-fc7879598-jxq2d\" (UID: \"bcf91163-5540-4da4-a8d2-6e769ef3cc47\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d" Apr 24 23:54:23.720971 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.720790 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsjzk\" (UniqueName: \"kubernetes.io/projected/bcf91163-5540-4da4-a8d2-6e769ef3cc47-kube-api-access-vsjzk\") pod \"klusterlet-addon-workmgr-fc7879598-jxq2d\" (UID: \"bcf91163-5540-4da4-a8d2-6e769ef3cc47\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d" Apr 24 23:54:23.721251 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.721201 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bcf91163-5540-4da4-a8d2-6e769ef3cc47-tmp\") pod \"klusterlet-addon-workmgr-fc7879598-jxq2d\" (UID: \"bcf91163-5540-4da4-a8d2-6e769ef3cc47\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d" Apr 24 23:54:23.725074 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.725056 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bcf91163-5540-4da4-a8d2-6e769ef3cc47-klusterlet-config\") pod \"klusterlet-addon-workmgr-fc7879598-jxq2d\" (UID: \"bcf91163-5540-4da4-a8d2-6e769ef3cc47\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d" Apr 24 23:54:23.729661 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.729634 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vsjzk\" (UniqueName: \"kubernetes.io/projected/bcf91163-5540-4da4-a8d2-6e769ef3cc47-kube-api-access-vsjzk\") pod \"klusterlet-addon-workmgr-fc7879598-jxq2d\" (UID: \"bcf91163-5540-4da4-a8d2-6e769ef3cc47\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d" Apr 24 23:54:23.895489 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:23.895381 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d" Apr 24 23:54:24.015715 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:24.015686 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d"] Apr 24 23:54:24.018980 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:54:24.018954 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcf91163_5540_4da4_a8d2_6e769ef3cc47.slice/crio-815c151aa4bf4e7ceb62cbe2dc3133fe7fbe4a01483a93cbc5ace93c8e0d92a3 WatchSource:0}: Error finding container 815c151aa4bf4e7ceb62cbe2dc3133fe7fbe4a01483a93cbc5ace93c8e0d92a3: Status 404 returned error can't find the container with id 815c151aa4bf4e7ceb62cbe2dc3133fe7fbe4a01483a93cbc5ace93c8e0d92a3 Apr 24 23:54:24.617474 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:24.617440 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d" event={"ID":"bcf91163-5540-4da4-a8d2-6e769ef3cc47","Type":"ContainerStarted","Data":"815c151aa4bf4e7ceb62cbe2dc3133fe7fbe4a01483a93cbc5ace93c8e0d92a3"} Apr 24 23:54:24.626969 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:24.626936 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:54:24.627111 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:24.626984 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5" Apr 24 23:54:24.627111 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:24.627041 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z" Apr 24 23:54:24.627111 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:24.627102 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:24.627301 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:24.627122 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69dc66854d-xqpjn: secret "image-registry-tls" not found Apr 24 23:54:24.627301 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:24.627138 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:24.627301 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:24.627158 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:24.627301 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:24.627199 2566 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls podName:a185de46-a1c8-4c54-b1dc-d4928ab48ce4 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:56.627178893 +0000 UTC m=+96.756387169 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls") pod "image-registry-69dc66854d-xqpjn" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4") : secret "image-registry-tls" not found Apr 24 23:54:24.627301 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:24.627242 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert podName:47c85934-321e-42e7-9abd-19c5dc8818e0 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:56.627223647 +0000 UTC m=+96.756431908 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert") pod "ingress-canary-b956z" (UID: "47c85934-321e-42e7-9abd-19c5dc8818e0") : secret "canary-serving-cert" not found Apr 24 23:54:24.627301 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:24.627261 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls podName:6c180154-dc8b-44dc-a86b-9564a07e09c5 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:56.627251629 +0000 UTC m=+96.756459892 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls") pod "dns-default-75ql5" (UID: "6c180154-dc8b-44dc-a86b-9564a07e09c5") : secret "dns-default-metrics-tls" not found Apr 24 23:54:26.145495 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:26.145448 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:54:26.148365 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:26.148336 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 23:54:26.156629 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:26.156607 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 23:54:26.156768 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:26.156685 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs podName:f9f062da-f1a8-4e5a-ac2f-ad672791353b nodeName:}" failed. No retries permitted until 2026-04-24 23:55:30.156661924 +0000 UTC m=+130.285870201 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs") pod "network-metrics-daemon-c6pqs" (UID: "f9f062da-f1a8-4e5a-ac2f-ad672791353b") : secret "metrics-daemon-secret" not found Apr 24 23:54:26.246788 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:26.246753 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:54:26.249590 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:26.249561 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 23:54:26.260314 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:26.260288 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d4e321e-40d2-4107-9dbd-581cbfeb3ada-original-pull-secret\") pod \"global-pull-secret-syncer-bx6k8\" (UID: \"4d4e321e-40d2-4107-9dbd-581cbfeb3ada\") " pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:54:26.511437 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:26.511353 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bx6k8" Apr 24 23:54:27.507158 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:27.507135 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bx6k8"] Apr 24 23:54:27.510159 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:54:27.510129 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d4e321e_40d2_4107_9dbd_581cbfeb3ada.slice/crio-6a40c70a30611933b5d6ee1f4dfd6cca6ef8f331d25225c395f51625a7f6b65a WatchSource:0}: Error finding container 6a40c70a30611933b5d6ee1f4dfd6cca6ef8f331d25225c395f51625a7f6b65a: Status 404 returned error can't find the container with id 6a40c70a30611933b5d6ee1f4dfd6cca6ef8f331d25225c395f51625a7f6b65a Apr 24 23:54:27.625047 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:27.625013 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d" event={"ID":"bcf91163-5540-4da4-a8d2-6e769ef3cc47","Type":"ContainerStarted","Data":"9795097f91ae903fcd1eb238970cfa09bdf3c0bcf3f24ba5fc3f16ea4bc5e24e"} Apr 24 23:54:27.625238 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:27.625189 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d" Apr 24 23:54:27.626126 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:27.626099 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bx6k8" event={"ID":"4d4e321e-40d2-4107-9dbd-581cbfeb3ada","Type":"ContainerStarted","Data":"6a40c70a30611933b5d6ee1f4dfd6cca6ef8f331d25225c395f51625a7f6b65a"} Apr 24 23:54:27.626819 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:27.626801 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d" Apr 24 23:54:27.640338 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:27.640293 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-fc7879598-jxq2d" podStartSLOduration=1.192325493 podStartE2EDuration="4.640281811s" podCreationTimestamp="2026-04-24 23:54:23 +0000 UTC" firstStartedPulling="2026-04-24 23:54:24.0211913 +0000 UTC m=+64.150399561" lastFinishedPulling="2026-04-24 23:54:27.46914761 +0000 UTC m=+67.598355879" observedRunningTime="2026-04-24 23:54:27.639855761 +0000 UTC m=+67.769064038" watchObservedRunningTime="2026-04-24 23:54:27.640281811 +0000 UTC m=+67.769490126" Apr 24 23:54:30.575449 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:30.575416 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vq8nz" Apr 24 23:54:31.635535 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:31.635501 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bx6k8" event={"ID":"4d4e321e-40d2-4107-9dbd-581cbfeb3ada","Type":"ContainerStarted","Data":"cf0b8e6f3cfda122fdfc60c33578137311bc3634fadad39d9ee16bd13fef9ffa"} Apr 24 23:54:31.650137 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:31.650090 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bx6k8" podStartSLOduration=68.377033819 podStartE2EDuration="1m11.650075384s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="2026-04-24 23:54:27.511920923 +0000 UTC m=+67.641129181" lastFinishedPulling="2026-04-24 23:54:30.784962486 +0000 UTC m=+70.914170746" observedRunningTime="2026-04-24 23:54:31.649650756 +0000 UTC m=+71.778859036" watchObservedRunningTime="2026-04-24 23:54:31.650075384 +0000 UTC m=+71.779283663" Apr 24 23:54:56.663429 ip-10-0-139-62 
kubenswrapper[2566]: I0424 23:54:56.663390 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5" Apr 24 23:54:56.663876 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:56.663452 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z" Apr 24 23:54:56.663876 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:54:56.663480 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:54:56.663876 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:56.663556 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 23:54:56.663876 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:56.663566 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69dc66854d-xqpjn: secret "image-registry-tls" not found Apr 24 23:54:56.663876 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:56.663566 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:56.663876 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:56.663591 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 24 23:54:56.663876 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:56.663638 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls podName:a185de46-a1c8-4c54-b1dc-d4928ab48ce4 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:00.663625473 +0000 UTC m=+160.792833730 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls") pod "image-registry-69dc66854d-xqpjn" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4") : secret "image-registry-tls" not found Apr 24 23:54:56.663876 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:56.663650 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls podName:6c180154-dc8b-44dc-a86b-9564a07e09c5 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:00.66364481 +0000 UTC m=+160.792853067 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls") pod "dns-default-75ql5" (UID: "6c180154-dc8b-44dc-a86b-9564a07e09c5") : secret "dns-default-metrics-tls" not found Apr 24 23:54:56.663876 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:54:56.663666 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert podName:47c85934-321e-42e7-9abd-19c5dc8818e0 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:00.663660779 +0000 UTC m=+160.792869036 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert") pod "ingress-canary-b956z" (UID: "47c85934-321e-42e7-9abd-19c5dc8818e0") : secret "canary-serving-cert" not found Apr 24 23:55:30.205189 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:30.205136 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:55:30.205723 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:30.205278 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 23:55:30.205723 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:30.205333 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs podName:f9f062da-f1a8-4e5a-ac2f-ad672791353b nodeName:}" failed. No retries permitted until 2026-04-24 23:57:32.205317352 +0000 UTC m=+252.334525609 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs") pod "network-metrics-daemon-c6pqs" (UID: "f9f062da-f1a8-4e5a-ac2f-ad672791353b") : secret "metrics-daemon-secret" not found Apr 24 23:55:45.684615 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.684568 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-84b54f7d86-6hsq2"] Apr 24 23:55:45.687303 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.687286 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:55:45.690135 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.690114 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 23:55:45.690379 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.690356 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 23:55:45.690535 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.690357 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-qc6hx\"" Apr 24 23:55:45.690658 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.690642 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 23:55:45.690717 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.690647 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 23:55:45.691194 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.691180 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 23:55:45.691262 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.691195 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 23:55:45.702614 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.702595 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-84b54f7d86-6hsq2"] Apr 24 23:55:45.786333 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.786304 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"] Apr 24 23:55:45.789874 ip-10-0-139-62 
kubenswrapper[2566]: I0424 23:55:45.789828 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp" Apr 24 23:55:45.794217 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.794187 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 23:55:45.794781 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.794762 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-mj8gp\"" Apr 24 23:55:45.795126 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.795104 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 23:55:45.795289 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.795269 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 23:55:45.796428 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.796410 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 23:55:45.800813 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.800792 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"] Apr 24 23:55:45.816598 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.816570 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-default-certificate\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:55:45.816685 ip-10-0-139-62 kubenswrapper[2566]: 
I0424 23:55:45.816607 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhsh7\" (UniqueName: \"kubernetes.io/projected/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-kube-api-access-mhsh7\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:55:45.816685 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.816659 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-stats-auth\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:55:45.816755 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.816685 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:55:45.816755 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.816713 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:55:45.917382 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.917355 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-stats-auth\") pod 
\"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:45.917382 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.917382 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:45.917530 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.917410 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:45.917530 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:45.917505 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 23:55:45.917590 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:45.917514 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle podName:3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:46.417501403 +0000 UTC m=+146.546709673 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle") pod "router-default-84b54f7d86-6hsq2" (UID: "3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31") : configmap references non-existent config key: service-ca.crt
Apr 24 23:55:45.917629 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.917607 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c6db25df-5f50-488f-94b9-8d2c23f69077-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"
Apr 24 23:55:45.917691 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:45.917674 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs podName:3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:46.417653575 +0000 UTC m=+146.546861840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs") pod "router-default-84b54f7d86-6hsq2" (UID: "3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31") : secret "router-metrics-certs-default" not found
Apr 24 23:55:45.917748 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.917733 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-default-certificate\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:45.917785 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.917759 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhsh7\" (UniqueName: \"kubernetes.io/projected/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-kube-api-access-mhsh7\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:45.917837 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.917783 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrlmx\" (UniqueName: \"kubernetes.io/projected/c6db25df-5f50-488f-94b9-8d2c23f69077-kube-api-access-xrlmx\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"
Apr 24 23:55:45.917837 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.917818 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"
Apr 24 23:55:45.919760 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.919734 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-stats-auth\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:45.919951 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.919935 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-default-certificate\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:45.925770 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:45.925752 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhsh7\" (UniqueName: \"kubernetes.io/projected/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-kube-api-access-mhsh7\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:46.018799 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:46.018731 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c6db25df-5f50-488f-94b9-8d2c23f69077-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"
Apr 24 23:55:46.018799 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:46.018784 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrlmx\" (UniqueName: \"kubernetes.io/projected/c6db25df-5f50-488f-94b9-8d2c23f69077-kube-api-access-xrlmx\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"
Apr 24 23:55:46.018958 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:46.018815 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"
Apr 24 23:55:46.018958 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:46.018942 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 23:55:46.019024 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:46.019019 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls podName:c6db25df-5f50-488f-94b9-8d2c23f69077 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:46.519000513 +0000 UTC m=+146.648208770 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hsllp" (UID: "c6db25df-5f50-488f-94b9-8d2c23f69077") : secret "cluster-monitoring-operator-tls" not found
Apr 24 23:55:46.019458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:46.019440 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c6db25df-5f50-488f-94b9-8d2c23f69077-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"
Apr 24 23:55:46.027613 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:46.027587 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrlmx\" (UniqueName: \"kubernetes.io/projected/c6db25df-5f50-488f-94b9-8d2c23f69077-kube-api-access-xrlmx\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"
Apr 24 23:55:46.422140 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:46.422093 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:46.422140 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:46.422150 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:46.422367 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:46.422251 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 23:55:46.422367 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:46.422285 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle podName:3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:47.422270147 +0000 UTC m=+147.551478406 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle") pod "router-default-84b54f7d86-6hsq2" (UID: "3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31") : configmap references non-existent config key: service-ca.crt
Apr 24 23:55:46.422367 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:46.422302 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs podName:3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:47.422294542 +0000 UTC m=+147.551502799 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs") pod "router-default-84b54f7d86-6hsq2" (UID: "3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31") : secret "router-metrics-certs-default" not found
Apr 24 23:55:46.522478 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:46.522442 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"
Apr 24 23:55:46.522626 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:46.522584 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 23:55:46.522692 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:46.522657 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls podName:c6db25df-5f50-488f-94b9-8d2c23f69077 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:47.52264242 +0000 UTC m=+147.651850678 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hsllp" (UID: "c6db25df-5f50-488f-94b9-8d2c23f69077") : secret "cluster-monitoring-operator-tls" not found
Apr 24 23:55:47.429665 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:47.429629 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:47.430103 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:47.429676 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:47.430103 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:47.429796 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 23:55:47.430103 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:47.429804 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle podName:3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:49.429791978 +0000 UTC m=+149.559000235 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle") pod "router-default-84b54f7d86-6hsq2" (UID: "3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31") : configmap references non-existent config key: service-ca.crt
Apr 24 23:55:47.430103 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:47.429884 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs podName:3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:49.429864825 +0000 UTC m=+149.559073094 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs") pod "router-default-84b54f7d86-6hsq2" (UID: "3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31") : secret "router-metrics-certs-default" not found
Apr 24 23:55:47.530531 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:47.530497 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"
Apr 24 23:55:47.530693 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:47.530611 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 23:55:47.530693 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:47.530660 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls podName:c6db25df-5f50-488f-94b9-8d2c23f69077 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:49.530647248 +0000 UTC m=+149.659855505 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hsllp" (UID: "c6db25df-5f50-488f-94b9-8d2c23f69077") : secret "cluster-monitoring-operator-tls" not found
Apr 24 23:55:49.442574 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:49.442537 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:49.442574 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:49.442582 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:49.443022 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:49.442690 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 23:55:49.443022 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:49.442717 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle podName:3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:53.442703687 +0000 UTC m=+153.571911944 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle") pod "router-default-84b54f7d86-6hsq2" (UID: "3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31") : configmap references non-existent config key: service-ca.crt
Apr 24 23:55:49.443022 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:49.442741 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs podName:3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:53.442728805 +0000 UTC m=+153.571937062 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs") pod "router-default-84b54f7d86-6hsq2" (UID: "3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31") : secret "router-metrics-certs-default" not found
Apr 24 23:55:49.543626 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:49.543585 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"
Apr 24 23:55:49.543797 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:49.543739 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 23:55:49.543842 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:49.543803 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls podName:c6db25df-5f50-488f-94b9-8d2c23f69077 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:53.543788489 +0000 UTC m=+153.672996750 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hsllp" (UID: "c6db25df-5f50-488f-94b9-8d2c23f69077") : secret "cluster-monitoring-operator-tls" not found
Apr 24 23:55:53.315307 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.315272 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pz5vx"]
Apr 24 23:55:53.318296 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.318280 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-pz5vx"
Apr 24 23:55:53.320905 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.320878 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 24 23:55:53.322023 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.322000 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 24 23:55:53.322131 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.322047 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 24 23:55:53.322131 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.322090 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-v4hdc\""
Apr 24 23:55:53.322131 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.322095 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 24 23:55:53.325535 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.325511 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pz5vx"]
Apr 24 23:55:53.372161 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.372117 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpm8r\" (UniqueName: \"kubernetes.io/projected/2fd0e935-473a-44c8-8b58-312212a8cd07-kube-api-access-mpm8r\") pod \"service-ca-865cb79987-pz5vx\" (UID: \"2fd0e935-473a-44c8-8b58-312212a8cd07\") " pod="openshift-service-ca/service-ca-865cb79987-pz5vx"
Apr 24 23:55:53.372368 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.372198 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2fd0e935-473a-44c8-8b58-312212a8cd07-signing-key\") pod \"service-ca-865cb79987-pz5vx\" (UID: \"2fd0e935-473a-44c8-8b58-312212a8cd07\") " pod="openshift-service-ca/service-ca-865cb79987-pz5vx"
Apr 24 23:55:53.372368 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.372312 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2fd0e935-473a-44c8-8b58-312212a8cd07-signing-cabundle\") pod \"service-ca-865cb79987-pz5vx\" (UID: \"2fd0e935-473a-44c8-8b58-312212a8cd07\") " pod="openshift-service-ca/service-ca-865cb79987-pz5vx"
Apr 24 23:55:53.472989 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.472951 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2fd0e935-473a-44c8-8b58-312212a8cd07-signing-key\") pod \"service-ca-865cb79987-pz5vx\" (UID: \"2fd0e935-473a-44c8-8b58-312212a8cd07\") " pod="openshift-service-ca/service-ca-865cb79987-pz5vx"
Apr 24 23:55:53.473159 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.472998 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:53.473159 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.473039 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2"
Apr 24 23:55:53.473159 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.473060 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2fd0e935-473a-44c8-8b58-312212a8cd07-signing-cabundle\") pod \"service-ca-865cb79987-pz5vx\" (UID: \"2fd0e935-473a-44c8-8b58-312212a8cd07\") " pod="openshift-service-ca/service-ca-865cb79987-pz5vx"
Apr 24 23:55:53.473159 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.473145 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpm8r\" (UniqueName: \"kubernetes.io/projected/2fd0e935-473a-44c8-8b58-312212a8cd07-kube-api-access-mpm8r\") pod \"service-ca-865cb79987-pz5vx\" (UID: \"2fd0e935-473a-44c8-8b58-312212a8cd07\") " pod="openshift-service-ca/service-ca-865cb79987-pz5vx"
Apr 24 23:55:53.473377 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:53.473226 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle podName:3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:01.473187149 +0000 UTC m=+161.602395406 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle") pod "router-default-84b54f7d86-6hsq2" (UID: "3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31") : configmap references non-existent config key: service-ca.crt
Apr 24 23:55:53.473377 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:53.473263 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 23:55:53.473377 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:53.473330 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs podName:3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:01.473313144 +0000 UTC m=+161.602521405 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs") pod "router-default-84b54f7d86-6hsq2" (UID: "3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31") : secret "router-metrics-certs-default" not found
Apr 24 23:55:53.473763 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.473745 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2fd0e935-473a-44c8-8b58-312212a8cd07-signing-cabundle\") pod \"service-ca-865cb79987-pz5vx\" (UID: \"2fd0e935-473a-44c8-8b58-312212a8cd07\") " pod="openshift-service-ca/service-ca-865cb79987-pz5vx"
Apr 24 23:55:53.475397 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.475382 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2fd0e935-473a-44c8-8b58-312212a8cd07-signing-key\") " pod="openshift-service-ca/service-ca-865cb79987-pz5vx"
Apr 24 23:55:53.481816 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.481797 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpm8r\" (UniqueName: \"kubernetes.io/projected/2fd0e935-473a-44c8-8b58-312212a8cd07-kube-api-access-mpm8r\") pod \"service-ca-865cb79987-pz5vx\" (UID: \"2fd0e935-473a-44c8-8b58-312212a8cd07\") " pod="openshift-service-ca/service-ca-865cb79987-pz5vx"
Apr 24 23:55:53.574492 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.574406 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"
Apr 24 23:55:53.574638 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:53.574578 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 23:55:53.574700 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:53.574674 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls podName:c6db25df-5f50-488f-94b9-8d2c23f69077 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:01.574654487 +0000 UTC m=+161.703862746 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hsllp" (UID: "c6db25df-5f50-488f-94b9-8d2c23f69077") : secret "cluster-monitoring-operator-tls" not found
Apr 24 23:55:53.627941 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.627901 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-pz5vx"
Apr 24 23:55:53.740147 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.740114 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-pz5vx"]
Apr 24 23:55:53.743164 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:55:53.743135 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fd0e935_473a_44c8_8b58_312212a8cd07.slice/crio-633553be9cf22f4bf11785aff60b8ddcba15a0c75320cb1c7b4d5b079dc23ab6 WatchSource:0}: Error finding container 633553be9cf22f4bf11785aff60b8ddcba15a0c75320cb1c7b4d5b079dc23ab6: Status 404 returned error can't find the container with id 633553be9cf22f4bf11785aff60b8ddcba15a0c75320cb1c7b4d5b079dc23ab6
Apr 24 23:55:53.787569 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:53.787536 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-pz5vx" event={"ID":"2fd0e935-473a-44c8-8b58-312212a8cd07","Type":"ContainerStarted","Data":"633553be9cf22f4bf11785aff60b8ddcba15a0c75320cb1c7b4d5b079dc23ab6"}
Apr 24 23:55:54.250867 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:54.250838 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jgjtn_3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2/dns-node-resolver/0.log"
Apr 24 23:55:54.851150 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:54.851121 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-64t8l_7fd21580-2e57-4cb2-8470-18fa0629553c/node-ca/0.log"
Apr 24 23:55:55.761901 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:55.761852 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" podUID="a185de46-a1c8-4c54-b1dc-d4928ab48ce4"
Apr 24 23:55:55.777718 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:55.777681 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-75ql5" podUID="6c180154-dc8b-44dc-a86b-9564a07e09c5"
Apr 24 23:55:55.782975 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:55.782945 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-b956z" podUID="47c85934-321e-42e7-9abd-19c5dc8818e0"
Apr 24 23:55:55.792623 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:55.792594 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:55:55.792623 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:55.792574 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-pz5vx" event={"ID":"2fd0e935-473a-44c8-8b58-312212a8cd07","Type":"ContainerStarted","Data":"45b7b2cb2f671c6486571b62ea580fe15ae4a234194fcb8d31aba6804e38686b"}
Apr 24 23:55:55.792787 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:55.792594 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b956z"
Apr 24 23:55:55.792787 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:55.792706 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-75ql5"
Apr 24 23:55:55.809698 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:55:55.809647 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-pz5vx" podStartSLOduration=1.275961559 podStartE2EDuration="2.809632896s" podCreationTimestamp="2026-04-24 23:55:53 +0000 UTC" firstStartedPulling="2026-04-24 23:55:53.744954328 +0000 UTC m=+153.874162586" lastFinishedPulling="2026-04-24 23:55:55.278625656 +0000 UTC m=+155.407833923" observedRunningTime="2026-04-24 23:55:55.809035798 +0000 UTC m=+155.938244076" watchObservedRunningTime="2026-04-24 23:55:55.809632896 +0000 UTC m=+155.938841175"
Apr 24 23:55:57.417109 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:55:57.417065 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-c6pqs" podUID="f9f062da-f1a8-4e5a-ac2f-ad672791353b"
Apr 24 23:56:00.736280 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:00.736246 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:56:00.736687 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:00.736290 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5"
Apr 24 23:56:00.736687 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:00.736425 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z"
Apr 24 23:56:00.738504 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:00.738484 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6c180154-dc8b-44dc-a86b-9564a07e09c5-metrics-tls\") pod \"dns-default-75ql5\" (UID: \"6c180154-dc8b-44dc-a86b-9564a07e09c5\") " pod="openshift-dns/dns-default-75ql5"
Apr 24 23:56:00.738731 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:00.738711 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47c85934-321e-42e7-9abd-19c5dc8818e0-cert\") pod \"ingress-canary-b956z\" (UID: \"47c85934-321e-42e7-9abd-19c5dc8818e0\") " pod="openshift-ingress-canary/ingress-canary-b956z"
Apr 24 23:56:00.738795 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:00.738711 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") pod \"image-registry-69dc66854d-xqpjn\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:56:00.897622 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:00.897590 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ktrpk\""
Apr 24 23:56:00.897622 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:00.897605 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-h7snm\""
Apr 24 23:56:00.897622 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:00.897590 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-f8xnt\""
Apr 24 23:56:00.904852 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:00.904837 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b956z"
Apr 24 23:56:00.904906 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:00.904865 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-75ql5"
Apr 24 23:56:00.905029 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:00.905012 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:56:01.054274 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:01.054238 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69dc66854d-xqpjn"]
Apr 24 23:56:01.057152 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:56:01.057110 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda185de46_a1c8_4c54_b1dc_d4928ab48ce4.slice/crio-3b1a524e07785e8c01d0bc69ca77cf5edd0fc17ad9f934f1397ed14406f69aa7 WatchSource:0}: Error finding container 3b1a524e07785e8c01d0bc69ca77cf5edd0fc17ad9f934f1397ed14406f69aa7: Status 404 returned error can't find the container with id 3b1a524e07785e8c01d0bc69ca77cf5edd0fc17ad9f934f1397ed14406f69aa7
Apr 24 23:56:01.266244 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:01.266132 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b956z"]
Apr 24 23:56:01.268885 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:01.268615 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["openshift-dns/dns-default-75ql5"] Apr 24 23:56:01.271348 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:56:01.271318 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47c85934_321e_42e7_9abd_19c5dc8818e0.slice/crio-4909f1d8ce5ad6fc6512953a6d70022583a28816706769f857ea1f9a4e631998 WatchSource:0}: Error finding container 4909f1d8ce5ad6fc6512953a6d70022583a28816706769f857ea1f9a4e631998: Status 404 returned error can't find the container with id 4909f1d8ce5ad6fc6512953a6d70022583a28816706769f857ea1f9a4e631998 Apr 24 23:56:01.275096 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:56:01.275071 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c180154_dc8b_44dc_a86b_9564a07e09c5.slice/crio-4cc5a4ae7eadf28be09a981732eb528820e779e78bf5ebb71d8872df16550586 WatchSource:0}: Error finding container 4cc5a4ae7eadf28be09a981732eb528820e779e78bf5ebb71d8872df16550586: Status 404 returned error can't find the container with id 4cc5a4ae7eadf28be09a981732eb528820e779e78bf5ebb71d8872df16550586 Apr 24 23:56:01.543602 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:01.543565 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:56:01.543735 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:01.543618 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2" 
Apr 24 23:56:01.543735 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:56:01.543698 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 23:56:01.543814 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:56:01.543761 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs podName:3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:17.543747442 +0000 UTC m=+177.672955703 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs") pod "router-default-84b54f7d86-6hsq2" (UID: "3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31") : secret "router-metrics-certs-default" not found Apr 24 23:56:01.543814 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:56:01.543774 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle podName:3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:17.543769152 +0000 UTC m=+177.672977413 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle") pod "router-default-84b54f7d86-6hsq2" (UID: "3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31") : configmap references non-existent config key: service-ca.crt Apr 24 23:56:01.644143 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:01.644099 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp" Apr 24 23:56:01.644311 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:56:01.644243 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 23:56:01.644355 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:56:01.644334 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls podName:c6db25df-5f50-488f-94b9-8d2c23f69077 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:17.644312171 +0000 UTC m=+177.773520501 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hsllp" (UID: "c6db25df-5f50-488f-94b9-8d2c23f69077") : secret "cluster-monitoring-operator-tls" not found Apr 24 23:56:01.806944 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:01.806857 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" event={"ID":"a185de46-a1c8-4c54-b1dc-d4928ab48ce4","Type":"ContainerStarted","Data":"0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05"} Apr 24 23:56:01.806944 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:01.806898 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" event={"ID":"a185de46-a1c8-4c54-b1dc-d4928ab48ce4","Type":"ContainerStarted","Data":"3b1a524e07785e8c01d0bc69ca77cf5edd0fc17ad9f934f1397ed14406f69aa7"} Apr 24 23:56:01.807442 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:01.806990 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:56:01.808235 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:01.808189 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-75ql5" event={"ID":"6c180154-dc8b-44dc-a86b-9564a07e09c5","Type":"ContainerStarted","Data":"4cc5a4ae7eadf28be09a981732eb528820e779e78bf5ebb71d8872df16550586"} Apr 24 23:56:01.809301 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:01.809261 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b956z" event={"ID":"47c85934-321e-42e7-9abd-19c5dc8818e0","Type":"ContainerStarted","Data":"4909f1d8ce5ad6fc6512953a6d70022583a28816706769f857ea1f9a4e631998"} Apr 24 23:56:01.827018 ip-10-0-139-62 kubenswrapper[2566]: I0424 
23:56:01.826952 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" podStartSLOduration=161.826938468 podStartE2EDuration="2m41.826938468s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:01.825793468 +0000 UTC m=+161.955001761" watchObservedRunningTime="2026-04-24 23:56:01.826938468 +0000 UTC m=+161.956146746" Apr 24 23:56:03.816874 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:03.816841 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-75ql5" event={"ID":"6c180154-dc8b-44dc-a86b-9564a07e09c5","Type":"ContainerStarted","Data":"6cda3c70153a10f5429bd19ac5811800a2476ff237a0cef5d8bd064692fdb23c"} Apr 24 23:56:03.816874 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:03.816879 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-75ql5" event={"ID":"6c180154-dc8b-44dc-a86b-9564a07e09c5","Type":"ContainerStarted","Data":"4b3d0cc0a196c14c9c8d9fa2c076b987f5736dd42940f49c47ef6e05b013bddb"} Apr 24 23:56:03.817623 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:03.816926 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-75ql5" Apr 24 23:56:03.818249 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:03.818223 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b956z" event={"ID":"47c85934-321e-42e7-9abd-19c5dc8818e0","Type":"ContainerStarted","Data":"18dcee25656347153d321ef8abf09169d49d18a4ec80cdbd6e5c33b6fbbe09fa"} Apr 24 23:56:03.837011 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:03.836967 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-75ql5" podStartSLOduration=129.937780743 podStartE2EDuration="2m11.836955677s" 
podCreationTimestamp="2026-04-24 23:53:52 +0000 UTC" firstStartedPulling="2026-04-24 23:56:01.277282849 +0000 UTC m=+161.406491121" lastFinishedPulling="2026-04-24 23:56:03.176457781 +0000 UTC m=+163.305666055" observedRunningTime="2026-04-24 23:56:03.835910267 +0000 UTC m=+163.965118547" watchObservedRunningTime="2026-04-24 23:56:03.836955677 +0000 UTC m=+163.966163954" Apr 24 23:56:03.852402 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:03.852354 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b956z" podStartSLOduration=129.94641862 podStartE2EDuration="2m11.852338792s" podCreationTimestamp="2026-04-24 23:53:52 +0000 UTC" firstStartedPulling="2026-04-24 23:56:01.273530326 +0000 UTC m=+161.402738599" lastFinishedPulling="2026-04-24 23:56:03.179450512 +0000 UTC m=+163.308658771" observedRunningTime="2026-04-24 23:56:03.851131019 +0000 UTC m=+163.980339297" watchObservedRunningTime="2026-04-24 23:56:03.852338792 +0000 UTC m=+163.981547072" Apr 24 23:56:12.400329 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:12.400284 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs" Apr 24 23:56:13.823044 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:13.823009 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-75ql5" Apr 24 23:56:16.517130 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.517077 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-sg89f"] Apr 24 23:56:16.522239 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.522192 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-sg89f" Apr 24 23:56:16.524875 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.524850 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-sj5qb\"" Apr 24 23:56:16.524988 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.524963 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 23:56:16.525259 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.525245 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 23:56:16.534892 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.534872 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-sg89f"] Apr 24 23:56:16.656455 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.656424 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gvs6p"] Apr 24 23:56:16.659540 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.659524 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.662128 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.662104 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 23:56:16.662267 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.662179 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 23:56:16.662267 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.662249 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 23:56:16.662594 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.662577 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rp52n\"" Apr 24 23:56:16.662719 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.662703 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 23:56:16.665575 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.665555 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fspgh\" (UniqueName: \"kubernetes.io/projected/aa5c2105-93ed-4d18-af5a-1ffb27c236a3-kube-api-access-fspgh\") pod \"downloads-6bcc868b7-sg89f\" (UID: \"aa5c2105-93ed-4d18-af5a-1ffb27c236a3\") " pod="openshift-console/downloads-6bcc868b7-sg89f" Apr 24 23:56:16.674155 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.674132 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gvs6p"] Apr 24 23:56:16.766061 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.766023 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/97e517a0-5f4e-40ee-b905-26216127aa87-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.766269 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.766144 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/97e517a0-5f4e-40ee-b905-26216127aa87-crio-socket\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.766269 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.766190 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fspgh\" (UniqueName: \"kubernetes.io/projected/aa5c2105-93ed-4d18-af5a-1ffb27c236a3-kube-api-access-fspgh\") pod \"downloads-6bcc868b7-sg89f\" (UID: \"aa5c2105-93ed-4d18-af5a-1ffb27c236a3\") " pod="openshift-console/downloads-6bcc868b7-sg89f" Apr 24 23:56:16.766269 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.766220 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/97e517a0-5f4e-40ee-b905-26216127aa87-data-volume\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.766269 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.766252 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/97e517a0-5f4e-40ee-b905-26216127aa87-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " 
pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.766422 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.766272 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fznvd\" (UniqueName: \"kubernetes.io/projected/97e517a0-5f4e-40ee-b905-26216127aa87-kube-api-access-fznvd\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.792248 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.792199 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fspgh\" (UniqueName: \"kubernetes.io/projected/aa5c2105-93ed-4d18-af5a-1ffb27c236a3-kube-api-access-fspgh\") pod \"downloads-6bcc868b7-sg89f\" (UID: \"aa5c2105-93ed-4d18-af5a-1ffb27c236a3\") " pod="openshift-console/downloads-6bcc868b7-sg89f" Apr 24 23:56:16.830727 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.830691 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-sg89f" Apr 24 23:56:16.867180 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.867143 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/97e517a0-5f4e-40ee-b905-26216127aa87-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.867365 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.867264 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/97e517a0-5f4e-40ee-b905-26216127aa87-crio-socket\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.867365 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.867314 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/97e517a0-5f4e-40ee-b905-26216127aa87-data-volume\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.867365 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.867354 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/97e517a0-5f4e-40ee-b905-26216127aa87-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.867526 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.867382 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fznvd\" (UniqueName: 
\"kubernetes.io/projected/97e517a0-5f4e-40ee-b905-26216127aa87-kube-api-access-fznvd\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.867526 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.867387 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/97e517a0-5f4e-40ee-b905-26216127aa87-crio-socket\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.867697 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.867674 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/97e517a0-5f4e-40ee-b905-26216127aa87-data-volume\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.867940 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.867925 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/97e517a0-5f4e-40ee-b905-26216127aa87-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.869468 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.869437 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/97e517a0-5f4e-40ee-b905-26216127aa87-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.878052 ip-10-0-139-62 kubenswrapper[2566]: I0424 
23:56:16.878026 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fznvd\" (UniqueName: \"kubernetes.io/projected/97e517a0-5f4e-40ee-b905-26216127aa87-kube-api-access-fznvd\") pod \"insights-runtime-extractor-gvs6p\" (UID: \"97e517a0-5f4e-40ee-b905-26216127aa87\") " pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:16.947507 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.947475 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-sg89f"] Apr 24 23:56:16.950754 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:56:16.950721 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa5c2105_93ed_4d18_af5a_1ffb27c236a3.slice/crio-dc23e2b0deed5f652ff94018aab06ac118fd0191ee0fb1770bec1fed6c1a9a73 WatchSource:0}: Error finding container dc23e2b0deed5f652ff94018aab06ac118fd0191ee0fb1770bec1fed6c1a9a73: Status 404 returned error can't find the container with id dc23e2b0deed5f652ff94018aab06ac118fd0191ee0fb1770bec1fed6c1a9a73 Apr 24 23:56:16.968124 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:16.968102 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gvs6p" Apr 24 23:56:17.122179 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.122145 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gvs6p"] Apr 24 23:56:17.125312 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:56:17.125285 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e517a0_5f4e_40ee_b905_26216127aa87.slice/crio-9bfd15f5c4aea2699d67f4f5c6ff73c6af9c47d5adfb14d036005a7060e61829 WatchSource:0}: Error finding container 9bfd15f5c4aea2699d67f4f5c6ff73c6af9c47d5adfb14d036005a7060e61829: Status 404 returned error can't find the container with id 9bfd15f5c4aea2699d67f4f5c6ff73c6af9c47d5adfb14d036005a7060e61829 Apr 24 23:56:17.572817 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.572776 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:56:17.573284 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.572839 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:56:17.573683 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.573630 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-service-ca-bundle\") pod \"router-default-84b54f7d86-6hsq2\" (UID: 
\"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:56:17.576143 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.576117 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31-metrics-certs\") pod \"router-default-84b54f7d86-6hsq2\" (UID: \"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31\") " pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:56:17.674253 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.674201 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp" Apr 24 23:56:17.679303 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.679276 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6db25df-5f50-488f-94b9-8d2c23f69077-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hsllp\" (UID: \"c6db25df-5f50-488f-94b9-8d2c23f69077\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp" Apr 24 23:56:17.795388 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.795348 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:56:17.859333 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.859295 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gvs6p" event={"ID":"97e517a0-5f4e-40ee-b905-26216127aa87","Type":"ContainerStarted","Data":"38ab0171f44dbea4ba0d5dfa9af0889e44d72943bf7e38764713a0d6c09800b5"} Apr 24 23:56:17.859461 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.859339 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gvs6p" event={"ID":"97e517a0-5f4e-40ee-b905-26216127aa87","Type":"ContainerStarted","Data":"b692949a37d3b03bb98d8eabe1316b92858f2ff94d3acacc9edd4922ecbaee95"} Apr 24 23:56:17.859461 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.859354 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gvs6p" event={"ID":"97e517a0-5f4e-40ee-b905-26216127aa87","Type":"ContainerStarted","Data":"9bfd15f5c4aea2699d67f4f5c6ff73c6af9c47d5adfb14d036005a7060e61829"} Apr 24 23:56:17.860753 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.860711 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-sg89f" event={"ID":"aa5c2105-93ed-4d18-af5a-1ffb27c236a3","Type":"ContainerStarted","Data":"dc23e2b0deed5f652ff94018aab06ac118fd0191ee0fb1770bec1fed6c1a9a73"} Apr 24 23:56:17.898561 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.898528 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp" Apr 24 23:56:17.941649 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:17.941611 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-84b54f7d86-6hsq2"] Apr 24 23:56:17.946310 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:56:17.946270 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fc3cbbb_c17a_46c2_bec2_3461b9d1bf31.slice/crio-77b47b18daf47f797d9235ea263bf8b4b5824c0689cecdb33189dd5e72557bdc WatchSource:0}: Error finding container 77b47b18daf47f797d9235ea263bf8b4b5824c0689cecdb33189dd5e72557bdc: Status 404 returned error can't find the container with id 77b47b18daf47f797d9235ea263bf8b4b5824c0689cecdb33189dd5e72557bdc Apr 24 23:56:18.040158 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:18.040134 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp"] Apr 24 23:56:18.042874 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:56:18.042838 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6db25df_5f50_488f_94b9_8d2c23f69077.slice/crio-c26f6f886088c24216fc1ae21fc605dface24ee0c8a082e677369e0167be088a WatchSource:0}: Error finding container c26f6f886088c24216fc1ae21fc605dface24ee0c8a082e677369e0167be088a: Status 404 returned error can't find the container with id c26f6f886088c24216fc1ae21fc605dface24ee0c8a082e677369e0167be088a Apr 24 23:56:18.865612 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:18.865538 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp" event={"ID":"c6db25df-5f50-488f-94b9-8d2c23f69077","Type":"ContainerStarted","Data":"c26f6f886088c24216fc1ae21fc605dface24ee0c8a082e677369e0167be088a"} Apr 24 23:56:18.868884 
ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:18.868810 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-84b54f7d86-6hsq2" event={"ID":"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31","Type":"ContainerStarted","Data":"8472dea8f18b6304bf390a49f55800f14a5e322548de27c8361df834b5eef386"} Apr 24 23:56:18.868884 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:18.868856 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-84b54f7d86-6hsq2" event={"ID":"3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31","Type":"ContainerStarted","Data":"77b47b18daf47f797d9235ea263bf8b4b5824c0689cecdb33189dd5e72557bdc"} Apr 24 23:56:18.890299 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:18.889283 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-84b54f7d86-6hsq2" podStartSLOduration=33.889264232 podStartE2EDuration="33.889264232s" podCreationTimestamp="2026-04-24 23:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:18.888226788 +0000 UTC m=+179.017435067" watchObservedRunningTime="2026-04-24 23:56:18.889264232 +0000 UTC m=+179.018472511" Apr 24 23:56:19.795857 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:19.795809 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:56:19.798961 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:19.798919 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:56:19.871160 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:19.871074 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:56:19.872508 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:19.872485 2566 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-84b54f7d86-6hsq2" Apr 24 23:56:20.875519 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:20.875473 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gvs6p" event={"ID":"97e517a0-5f4e-40ee-b905-26216127aa87","Type":"ContainerStarted","Data":"5cfb0621b0b33dcac991a4b6ce9f59faf168e81c69c31da9d6b954a6bfc4a5d3"} Apr 24 23:56:20.876926 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:20.876901 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp" event={"ID":"c6db25df-5f50-488f-94b9-8d2c23f69077","Type":"ContainerStarted","Data":"03524fc3f83cc9c89fc9bfd973c9146ef823cab03af06cd62e95d43318ae7b33"} Apr 24 23:56:20.894825 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:20.894763 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gvs6p" podStartSLOduration=2.2677730240000002 podStartE2EDuration="4.894745445s" podCreationTimestamp="2026-04-24 23:56:16 +0000 UTC" firstStartedPulling="2026-04-24 23:56:17.182399391 +0000 UTC m=+177.311607648" lastFinishedPulling="2026-04-24 23:56:19.809371795 +0000 UTC m=+179.938580069" observedRunningTime="2026-04-24 23:56:20.894023164 +0000 UTC m=+181.023231443" watchObservedRunningTime="2026-04-24 23:56:20.894745445 +0000 UTC m=+181.023954055" Apr 24 23:56:20.910317 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:20.910274 2566 patch_prober.go:28] interesting pod/image-registry-69dc66854d-xqpjn container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 23:56:20.910480 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:20.910345 
2566 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" podUID="a185de46-a1c8-4c54-b1dc-d4928ab48ce4" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 23:56:20.913785 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:20.913740 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hsllp" podStartSLOduration=34.145748942 podStartE2EDuration="35.913726384s" podCreationTimestamp="2026-04-24 23:55:45 +0000 UTC" firstStartedPulling="2026-04-24 23:56:18.045005796 +0000 UTC m=+178.174214053" lastFinishedPulling="2026-04-24 23:56:19.812983225 +0000 UTC m=+179.942191495" observedRunningTime="2026-04-24 23:56:20.913195634 +0000 UTC m=+181.042403911" watchObservedRunningTime="2026-04-24 23:56:20.913726384 +0000 UTC m=+181.042934708" Apr 24 23:56:22.816900 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:22.816867 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:56:23.488272 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.488237 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vf7rk"] Apr 24 23:56:23.492893 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.492867 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:23.496434 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.496397 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 23:56:23.496648 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.496628 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 23:56:23.496760 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.496701 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-zflwb\"" Apr 24 23:56:23.496832 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.496756 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 23:56:23.502122 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.502079 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vf7rk"] Apr 24 23:56:23.630571 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.630528 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88c50803-315c-421e-8a20-9c331d1c572e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vf7rk\" (UID: \"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:23.630726 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.630580 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c50803-315c-421e-8a20-9c331d1c572e-prometheus-operator-tls\") pod 
\"prometheus-operator-5676c8c784-vf7rk\" (UID: \"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:23.630726 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.630678 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm649\" (UniqueName: \"kubernetes.io/projected/88c50803-315c-421e-8a20-9c331d1c572e-kube-api-access-qm649\") pod \"prometheus-operator-5676c8c784-vf7rk\" (UID: \"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:23.630726 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.630714 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88c50803-315c-421e-8a20-9c331d1c572e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vf7rk\" (UID: \"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:23.731463 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.731430 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qm649\" (UniqueName: \"kubernetes.io/projected/88c50803-315c-421e-8a20-9c331d1c572e-kube-api-access-qm649\") pod \"prometheus-operator-5676c8c784-vf7rk\" (UID: \"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:23.731623 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.731481 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88c50803-315c-421e-8a20-9c331d1c572e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vf7rk\" (UID: \"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:23.731623 
ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.731538 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88c50803-315c-421e-8a20-9c331d1c572e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vf7rk\" (UID: \"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:23.731623 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.731560 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c50803-315c-421e-8a20-9c331d1c572e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vf7rk\" (UID: \"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:23.731780 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:56:23.731663 2566 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 24 23:56:23.731780 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:56:23.731721 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88c50803-315c-421e-8a20-9c331d1c572e-prometheus-operator-tls podName:88c50803-315c-421e-8a20-9c331d1c572e nodeName:}" failed. No retries permitted until 2026-04-24 23:56:24.231703403 +0000 UTC m=+184.360911670 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/88c50803-315c-421e-8a20-9c331d1c572e-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-vf7rk" (UID: "88c50803-315c-421e-8a20-9c331d1c572e") : secret "prometheus-operator-tls" not found Apr 24 23:56:23.732329 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.732301 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88c50803-315c-421e-8a20-9c331d1c572e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vf7rk\" (UID: \"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:23.734167 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.734134 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/88c50803-315c-421e-8a20-9c331d1c572e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vf7rk\" (UID: \"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:23.740507 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:23.740445 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm649\" (UniqueName: \"kubernetes.io/projected/88c50803-315c-421e-8a20-9c331d1c572e-kube-api-access-qm649\") pod \"prometheus-operator-5676c8c784-vf7rk\" (UID: \"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:24.236844 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:24.236800 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c50803-315c-421e-8a20-9c331d1c572e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vf7rk\" (UID: 
\"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:24.239636 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:24.239555 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c50803-315c-421e-8a20-9c331d1c572e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vf7rk\" (UID: \"88c50803-315c-421e-8a20-9c331d1c572e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:24.404536 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:24.404505 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" Apr 24 23:56:24.536905 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:24.536865 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vf7rk"] Apr 24 23:56:24.541408 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:56:24.541382 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88c50803_315c_421e_8a20_9c331d1c572e.slice/crio-61172828da8308af0d2f7890e14123b860dfde56dbf78b9e3b4fd2b00815d15a WatchSource:0}: Error finding container 61172828da8308af0d2f7890e14123b860dfde56dbf78b9e3b4fd2b00815d15a: Status 404 returned error can't find the container with id 61172828da8308af0d2f7890e14123b860dfde56dbf78b9e3b4fd2b00815d15a Apr 24 23:56:24.893273 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:24.893231 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" event={"ID":"88c50803-315c-421e-8a20-9c331d1c572e","Type":"ContainerStarted","Data":"61172828da8308af0d2f7890e14123b860dfde56dbf78b9e3b4fd2b00815d15a"} Apr 24 23:56:31.747333 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.747298 2566 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-console/console-9c7bb9bf7-l6tfg"] Apr 24 23:56:31.750069 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.750042 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:31.752998 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.752973 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 23:56:31.753906 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.753883 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 23:56:31.754048 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.753946 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 23:56:31.754048 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.753947 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-2sr25\"" Apr 24 23:56:31.754334 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.754317 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 23:56:31.755319 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.755296 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 23:56:31.760291 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.760256 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 23:56:31.763439 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.763418 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9c7bb9bf7-l6tfg"] Apr 24 23:56:31.902886 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.902850 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cb907ed-fab4-488e-b464-1a797b12a706-console-oauth-config\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:31.902886 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.902897 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb907ed-fab4-488e-b464-1a797b12a706-console-serving-cert\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:31.903129 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.902926 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-console-config\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:31.903129 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.902999 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-service-ca\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:31.903129 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.903017 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-trusted-ca-bundle\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") 
" pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:31.903129 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.903037 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbsnk\" (UniqueName: \"kubernetes.io/projected/0cb907ed-fab4-488e-b464-1a797b12a706-kube-api-access-zbsnk\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:31.903129 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:31.903083 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-oauth-serving-cert\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.003630 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.003551 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cb907ed-fab4-488e-b464-1a797b12a706-console-oauth-config\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.003630 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.003598 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb907ed-fab4-488e-b464-1a797b12a706-console-serving-cert\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.003630 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.003623 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-console-config\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.003904 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.003661 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-service-ca\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.003904 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.003681 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-trusted-ca-bundle\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.003904 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.003715 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbsnk\" (UniqueName: \"kubernetes.io/projected/0cb907ed-fab4-488e-b464-1a797b12a706-kube-api-access-zbsnk\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.003904 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.003778 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-oauth-serving-cert\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.004513 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.004461 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-console-config\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.004654 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.004609 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-oauth-serving-cert\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.004654 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.004634 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-service-ca\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.004879 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.004856 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-trusted-ca-bundle\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.006447 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.006423 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cb907ed-fab4-488e-b464-1a797b12a706-console-oauth-config\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.006551 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.006514 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb907ed-fab4-488e-b464-1a797b12a706-console-serving-cert\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.012922 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.012897 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbsnk\" (UniqueName: \"kubernetes.io/projected/0cb907ed-fab4-488e-b464-1a797b12a706-kube-api-access-zbsnk\") pod \"console-9c7bb9bf7-l6tfg\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") " pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.062243 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.062199 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:32.407431 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.407402 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9c7bb9bf7-l6tfg"] Apr 24 23:56:32.410775 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:56:32.410753 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cb907ed_fab4_488e_b464_1a797b12a706.slice/crio-2ff7c4c0c00f1fa4a7bd86b930b31457c8040075e90cfee7012fc30f6db40a58 WatchSource:0}: Error finding container 2ff7c4c0c00f1fa4a7bd86b930b31457c8040075e90cfee7012fc30f6db40a58: Status 404 returned error can't find the container with id 2ff7c4c0c00f1fa4a7bd86b930b31457c8040075e90cfee7012fc30f6db40a58 Apr 24 23:56:32.917968 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.917926 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-sg89f" event={"ID":"aa5c2105-93ed-4d18-af5a-1ffb27c236a3","Type":"ContainerStarted","Data":"6a33128448f443528a0ec4b6727c2eb860cea805cbb2cf5cef745c5c0892ede5"} Apr 24 23:56:32.918546 ip-10-0-139-62 
kubenswrapper[2566]: I0424 23:56:32.918501 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-sg89f" Apr 24 23:56:32.921150 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.921119 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" event={"ID":"88c50803-315c-421e-8a20-9c331d1c572e","Type":"ContainerStarted","Data":"6828bd5b8c6d7532a6d6d039bfe6a7c7ae5d84911c5e74d057237445eaeedd12"} Apr 24 23:56:32.921287 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.921156 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" event={"ID":"88c50803-315c-421e-8a20-9c331d1c572e","Type":"ContainerStarted","Data":"883eba693400e15dcd819d6194553ccbc0e1db8064fa41233370cbcb596dca41"} Apr 24 23:56:32.923514 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.923481 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c7bb9bf7-l6tfg" event={"ID":"0cb907ed-fab4-488e-b464-1a797b12a706","Type":"ContainerStarted","Data":"2ff7c4c0c00f1fa4a7bd86b930b31457c8040075e90cfee7012fc30f6db40a58"} Apr 24 23:56:32.931800 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.931778 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-sg89f" Apr 24 23:56:32.937754 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.937699 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-sg89f" podStartSLOduration=1.570314623 podStartE2EDuration="16.937687019s" podCreationTimestamp="2026-04-24 23:56:16 +0000 UTC" firstStartedPulling="2026-04-24 23:56:16.952510853 +0000 UTC m=+177.081719111" lastFinishedPulling="2026-04-24 23:56:32.319883244 +0000 UTC m=+192.449091507" observedRunningTime="2026-04-24 23:56:32.936780776 +0000 UTC m=+193.065989091" 
watchObservedRunningTime="2026-04-24 23:56:32.937687019 +0000 UTC m=+193.066895297" Apr 24 23:56:32.978376 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:32.977290 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-vf7rk" podStartSLOduration=2.246357905 podStartE2EDuration="9.97718712s" podCreationTimestamp="2026-04-24 23:56:23 +0000 UTC" firstStartedPulling="2026-04-24 23:56:24.543510407 +0000 UTC m=+184.672718667" lastFinishedPulling="2026-04-24 23:56:32.274339611 +0000 UTC m=+192.403547882" observedRunningTime="2026-04-24 23:56:32.975989354 +0000 UTC m=+193.105197634" watchObservedRunningTime="2026-04-24 23:56:32.97718712 +0000 UTC m=+193.106395401" Apr 24 23:56:34.995602 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:34.994168 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rg6lt"] Apr 24 23:56:35.056000 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.054561 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.058341 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.057093 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hmz5f\"" Apr 24 23:56:35.060885 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.059918 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 23:56:35.060885 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.059987 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 23:56:35.060885 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.060198 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 23:56:35.235480 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.235019 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bsd4\" (UniqueName: \"kubernetes.io/projected/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-kube-api-access-4bsd4\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.235480 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.235086 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-root\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.235480 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.235113 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-textfile\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.235480 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.235164 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-tls\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.235480 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.235186 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-metrics-client-ca\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.235480 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.235240 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.235480 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.235266 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-accelerators-collector-config\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " 
pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.235480 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.235310 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-sys\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.235480 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.235340 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-wtmp\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.337633 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.336582 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-tls\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.337633 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.336644 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-metrics-client-ca\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.337633 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.336683 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.337633 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.336710 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-accelerators-collector-config\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.337633 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.336757 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-sys\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.337633 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.336787 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-wtmp\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.337633 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.336830 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bsd4\" (UniqueName: \"kubernetes.io/projected/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-kube-api-access-4bsd4\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.337633 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.336859 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-root\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.337633 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.336882 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-textfile\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.337633 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.337256 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-textfile\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.339347 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.338316 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-sys\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.339347 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.338539 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-root\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.339347 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.338801 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-wtmp\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.339347 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.338818 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-metrics-client-ca\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.339787 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.339750 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-accelerators-collector-config\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.346688 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.346578 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bsd4\" (UniqueName: \"kubernetes.io/projected/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-kube-api-access-4bsd4\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.352069 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.351591 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-tls\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.357497 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.357425 2566 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6a77d3d5-2582-4083-80ef-aac6fdfc51b2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rg6lt\" (UID: \"6a77d3d5-2582-4083-80ef-aac6fdfc51b2\") " pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.371221 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.371177 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rg6lt" Apr 24 23:56:35.592732 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:56:35.592697 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a77d3d5_2582_4083_80ef_aac6fdfc51b2.slice/crio-29247c4f217d4485d5190c00cab5850b708f72e53971c44d193b86e04063f010 WatchSource:0}: Error finding container 29247c4f217d4485d5190c00cab5850b708f72e53971c44d193b86e04063f010: Status 404 returned error can't find the container with id 29247c4f217d4485d5190c00cab5850b708f72e53971c44d193b86e04063f010 Apr 24 23:56:35.936264 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:35.936161 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rg6lt" event={"ID":"6a77d3d5-2582-4083-80ef-aac6fdfc51b2","Type":"ContainerStarted","Data":"29247c4f217d4485d5190c00cab5850b708f72e53971c44d193b86e04063f010"} Apr 24 23:56:36.941053 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:36.941010 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c7bb9bf7-l6tfg" event={"ID":"0cb907ed-fab4-488e-b464-1a797b12a706","Type":"ContainerStarted","Data":"1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db"} Apr 24 23:56:36.959750 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:36.959706 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9c7bb9bf7-l6tfg" podStartSLOduration=2.327003657 
podStartE2EDuration="5.959689198s" podCreationTimestamp="2026-04-24 23:56:31 +0000 UTC" firstStartedPulling="2026-04-24 23:56:32.412851643 +0000 UTC m=+192.542059900" lastFinishedPulling="2026-04-24 23:56:36.045537182 +0000 UTC m=+196.174745441" observedRunningTime="2026-04-24 23:56:36.958258024 +0000 UTC m=+197.087466303" watchObservedRunningTime="2026-04-24 23:56:36.959689198 +0000 UTC m=+197.088897478" Apr 24 23:56:37.946302 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:37.946261 2566 generic.go:358] "Generic (PLEG): container finished" podID="6a77d3d5-2582-4083-80ef-aac6fdfc51b2" containerID="e0107107d7ea9c6e1ce8b1639a53b874ec9f45f4af1296682286b8f75157a6d6" exitCode=0 Apr 24 23:56:37.947058 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:37.946820 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rg6lt" event={"ID":"6a77d3d5-2582-4083-80ef-aac6fdfc51b2","Type":"ContainerDied","Data":"e0107107d7ea9c6e1ce8b1639a53b874ec9f45f4af1296682286b8f75157a6d6"} Apr 24 23:56:38.365363 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:38.365333 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69dc66854d-xqpjn"] Apr 24 23:56:38.951880 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:38.951848 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rg6lt" event={"ID":"6a77d3d5-2582-4083-80ef-aac6fdfc51b2","Type":"ContainerStarted","Data":"830649255af694d8ed8eb481b5e3c30bc0fd8697e389bdd2b906651b6adea1e6"} Apr 24 23:56:38.951880 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:38.951886 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rg6lt" event={"ID":"6a77d3d5-2582-4083-80ef-aac6fdfc51b2","Type":"ContainerStarted","Data":"6e95ea8a9540cc456cbd1a14bc1c8cbbecee01ab5d659f46f41bb7ae4d3efaa9"} Apr 24 23:56:38.978816 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:38.978768 2566 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rg6lt" podStartSLOduration=3.550939524 podStartE2EDuration="4.978753957s" podCreationTimestamp="2026-04-24 23:56:34 +0000 UTC" firstStartedPulling="2026-04-24 23:56:35.594861408 +0000 UTC m=+195.724069672" lastFinishedPulling="2026-04-24 23:56:37.022675844 +0000 UTC m=+197.151884105" observedRunningTime="2026-04-24 23:56:38.976785149 +0000 UTC m=+199.105993434" watchObservedRunningTime="2026-04-24 23:56:38.978753957 +0000 UTC m=+199.107962236" Apr 24 23:56:39.248667 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.248581 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-566d956db8-6q454"] Apr 24 23:56:39.275067 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.275036 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-566d956db8-6q454"] Apr 24 23:56:39.275261 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.275177 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.277868 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.277842 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-209d0m9u514ss\"" Apr 24 23:56:39.279233 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.278991 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-7hgg6\"" Apr 24 23:56:39.279233 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.278994 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 23:56:39.279233 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.278994 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 23:56:39.279233 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.279076 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 23:56:39.279233 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.279000 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 23:56:39.372332 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.372289 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/98aa683f-2faf-4e36-85f7-e01b1d0148bf-secret-metrics-server-tls\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.372517 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.372342 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98aa683f-2faf-4e36-85f7-e01b1d0148bf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.372517 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.372398 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl2ds\" (UniqueName: \"kubernetes.io/projected/98aa683f-2faf-4e36-85f7-e01b1d0148bf-kube-api-access-bl2ds\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.372517 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.372457 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/98aa683f-2faf-4e36-85f7-e01b1d0148bf-metrics-server-audit-profiles\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.372674 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.372525 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/98aa683f-2faf-4e36-85f7-e01b1d0148bf-audit-log\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.372674 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.372564 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98aa683f-2faf-4e36-85f7-e01b1d0148bf-client-ca-bundle\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.372674 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.372634 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/98aa683f-2faf-4e36-85f7-e01b1d0148bf-secret-metrics-server-client-certs\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.473864 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.473806 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/98aa683f-2faf-4e36-85f7-e01b1d0148bf-metrics-server-audit-profiles\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.474056 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.473912 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/98aa683f-2faf-4e36-85f7-e01b1d0148bf-audit-log\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.474056 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.473953 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98aa683f-2faf-4e36-85f7-e01b1d0148bf-client-ca-bundle\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " 
pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.474056 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.473985 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/98aa683f-2faf-4e36-85f7-e01b1d0148bf-secret-metrics-server-client-certs\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.474056 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.474023 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/98aa683f-2faf-4e36-85f7-e01b1d0148bf-secret-metrics-server-tls\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.474056 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.474050 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98aa683f-2faf-4e36-85f7-e01b1d0148bf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.474357 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.474081 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bl2ds\" (UniqueName: \"kubernetes.io/projected/98aa683f-2faf-4e36-85f7-e01b1d0148bf-kube-api-access-bl2ds\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.474528 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.474445 2566 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/98aa683f-2faf-4e36-85f7-e01b1d0148bf-audit-log\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.474988 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.474956 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98aa683f-2faf-4e36-85f7-e01b1d0148bf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.475107 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.474983 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/98aa683f-2faf-4e36-85f7-e01b1d0148bf-metrics-server-audit-profiles\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.476923 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.476896 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/98aa683f-2faf-4e36-85f7-e01b1d0148bf-secret-metrics-server-tls\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.477027 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.476932 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/98aa683f-2faf-4e36-85f7-e01b1d0148bf-secret-metrics-server-client-certs\") pod \"metrics-server-566d956db8-6q454\" (UID: 
\"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.477027 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.477005 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98aa683f-2faf-4e36-85f7-e01b1d0148bf-client-ca-bundle\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.485738 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.485715 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl2ds\" (UniqueName: \"kubernetes.io/projected/98aa683f-2faf-4e36-85f7-e01b1d0148bf-kube-api-access-bl2ds\") pod \"metrics-server-566d956db8-6q454\" (UID: \"98aa683f-2faf-4e36-85f7-e01b1d0148bf\") " pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.585876 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.585837 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:39.730966 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.730894 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-566d956db8-6q454"] Apr 24 23:56:39.736097 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:56:39.736064 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98aa683f_2faf_4e36_85f7_e01b1d0148bf.slice/crio-84ea2296d9a958d8bfc83b696234de4bc2ed68a3d451a1ba732696328893b0ac WatchSource:0}: Error finding container 84ea2296d9a958d8bfc83b696234de4bc2ed68a3d451a1ba732696328893b0ac: Status 404 returned error can't find the container with id 84ea2296d9a958d8bfc83b696234de4bc2ed68a3d451a1ba732696328893b0ac Apr 24 23:56:39.956654 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:39.956530 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-566d956db8-6q454" event={"ID":"98aa683f-2faf-4e36-85f7-e01b1d0148bf","Type":"ContainerStarted","Data":"84ea2296d9a958d8bfc83b696234de4bc2ed68a3d451a1ba732696328893b0ac"} Apr 24 23:56:41.531930 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:41.531893 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9c7bb9bf7-l6tfg"] Apr 24 23:56:41.964251 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:41.964186 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-566d956db8-6q454" event={"ID":"98aa683f-2faf-4e36-85f7-e01b1d0148bf","Type":"ContainerStarted","Data":"b6059ffeac60c690b6e7a891489377a21d05c7740a1fd7abb46ef5470435f9fa"} Apr 24 23:56:41.983260 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:41.983190 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-566d956db8-6q454" podStartSLOduration=0.912711781 podStartE2EDuration="2.983172929s" 
podCreationTimestamp="2026-04-24 23:56:39 +0000 UTC" firstStartedPulling="2026-04-24 23:56:39.738723866 +0000 UTC m=+199.867932125" lastFinishedPulling="2026-04-24 23:56:41.809185005 +0000 UTC m=+201.938393273" observedRunningTime="2026-04-24 23:56:41.981674287 +0000 UTC m=+202.110882589" watchObservedRunningTime="2026-04-24 23:56:41.983172929 +0000 UTC m=+202.112381210" Apr 24 23:56:42.062455 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:42.062363 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9c7bb9bf7-l6tfg" Apr 24 23:56:59.586950 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:59.586908 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:56:59.586950 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:56:59.586958 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-566d956db8-6q454" Apr 24 23:57:03.387306 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.387261 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" podUID="a185de46-a1c8-4c54-b1dc-d4928ab48ce4" containerName="registry" containerID="cri-o://0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05" gracePeriod=30 Apr 24 23:57:03.644190 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.644135 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" Apr 24 23:57:03.792458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.792426 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng9qh\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-kube-api-access-ng9qh\") pod \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " Apr 24 23:57:03.792458 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.792465 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") pod \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " Apr 24 23:57:03.792704 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.792483 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-trusted-ca\") pod \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " Apr 24 23:57:03.792704 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.792623 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-ca-trust-extracted\") pod \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " Apr 24 23:57:03.792704 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.792693 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-bound-sa-token\") pod \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") " Apr 24 23:57:03.792850 ip-10-0-139-62 
kubenswrapper[2566]: I0424 23:57:03.792733 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-certificates\") pod \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") "
Apr 24 23:57:03.792850 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.792766 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-installation-pull-secrets\") pod \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") "
Apr 24 23:57:03.792850 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.792805 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-image-registry-private-configuration\") pod \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\" (UID: \"a185de46-a1c8-4c54-b1dc-d4928ab48ce4\") "
Apr 24 23:57:03.793166 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.793133 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a185de46-a1c8-4c54-b1dc-d4928ab48ce4" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:03.793904 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.793349 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a185de46-a1c8-4c54-b1dc-d4928ab48ce4" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:03.794947 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.794925 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a185de46-a1c8-4c54-b1dc-d4928ab48ce4" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:57:03.795252 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.795196 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-kube-api-access-ng9qh" (OuterVolumeSpecName: "kube-api-access-ng9qh") pod "a185de46-a1c8-4c54-b1dc-d4928ab48ce4" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4"). InnerVolumeSpecName "kube-api-access-ng9qh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:57:03.795350 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.795318 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a185de46-a1c8-4c54-b1dc-d4928ab48ce4" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:57:03.795408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.795344 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a185de46-a1c8-4c54-b1dc-d4928ab48ce4" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:03.795499 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.795482 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a185de46-a1c8-4c54-b1dc-d4928ab48ce4" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:03.803009 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.802982 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a185de46-a1c8-4c54-b1dc-d4928ab48ce4" (UID: "a185de46-a1c8-4c54-b1dc-d4928ab48ce4"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 23:57:03.893504 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.893469 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-ca-trust-extracted\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:03.893504 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.893502 2566 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-bound-sa-token\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:03.893504 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.893514 2566 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-certificates\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:03.893666 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.893524 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-installation-pull-secrets\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:03.893666 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.893534 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-image-registry-private-configuration\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:03.893666 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.893544 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ng9qh\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-kube-api-access-ng9qh\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:03.893666 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.893552 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-registry-tls\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:03.893666 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:03.893561 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a185de46-a1c8-4c54-b1dc-d4928ab48ce4-trusted-ca\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:04.024561 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:04.024473 2566 generic.go:358] "Generic (PLEG): container finished" podID="a185de46-a1c8-4c54-b1dc-d4928ab48ce4" containerID="0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05" exitCode=0
Apr 24 23:57:04.024561 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:04.024537 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn"
Apr 24 23:57:04.024798 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:04.024553 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" event={"ID":"a185de46-a1c8-4c54-b1dc-d4928ab48ce4","Type":"ContainerDied","Data":"0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05"}
Apr 24 23:57:04.024798 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:04.024596 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69dc66854d-xqpjn" event={"ID":"a185de46-a1c8-4c54-b1dc-d4928ab48ce4","Type":"ContainerDied","Data":"3b1a524e07785e8c01d0bc69ca77cf5edd0fc17ad9f934f1397ed14406f69aa7"}
Apr 24 23:57:04.024798 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:04.024614 2566 scope.go:117] "RemoveContainer" containerID="0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05"
Apr 24 23:57:04.032526 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:04.032506 2566 scope.go:117] "RemoveContainer" containerID="0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05"
Apr 24 23:57:04.032784 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:57:04.032766 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05\": container with ID starting with 0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05 not found: ID does not exist" containerID="0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05"
Apr 24 23:57:04.032836 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:04.032790 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05"} err="failed to get container status \"0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05\": rpc error: code = NotFound desc = could not find container \"0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05\": container with ID starting with 0e15cbd42ad0319e9ca04066e40cf471d988593ea32f31ea35b3ffb85ba45d05 not found: ID does not exist"
Apr 24 23:57:04.045612 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:04.045584 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69dc66854d-xqpjn"]
Apr 24 23:57:04.049419 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:04.049395 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-69dc66854d-xqpjn"]
Apr 24 23:57:04.403216 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:04.403168 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a185de46-a1c8-4c54-b1dc-d4928ab48ce4" path="/var/lib/kubelet/pods/a185de46-a1c8-4c54-b1dc-d4928ab48ce4/volumes"
Apr 24 23:57:06.554120 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.554057 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9c7bb9bf7-l6tfg" podUID="0cb907ed-fab4-488e-b464-1a797b12a706" containerName="console" containerID="cri-o://1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db" gracePeriod=15
Apr 24 23:57:06.841890 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.841870 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9c7bb9bf7-l6tfg_0cb907ed-fab4-488e-b464-1a797b12a706/console/0.log"
Apr 24 23:57:06.842007 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.841924 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9c7bb9bf7-l6tfg"
Apr 24 23:57:06.918196 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.918168 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbsnk\" (UniqueName: \"kubernetes.io/projected/0cb907ed-fab4-488e-b464-1a797b12a706-kube-api-access-zbsnk\") pod \"0cb907ed-fab4-488e-b464-1a797b12a706\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") "
Apr 24 23:57:06.918352 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.918227 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-console-config\") pod \"0cb907ed-fab4-488e-b464-1a797b12a706\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") "
Apr 24 23:57:06.918352 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.918249 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb907ed-fab4-488e-b464-1a797b12a706-console-serving-cert\") pod \"0cb907ed-fab4-488e-b464-1a797b12a706\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") "
Apr 24 23:57:06.918352 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.918268 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-trusted-ca-bundle\") pod \"0cb907ed-fab4-488e-b464-1a797b12a706\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") "
Apr 24 23:57:06.918352 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.918309 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-service-ca\") pod \"0cb907ed-fab4-488e-b464-1a797b12a706\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") "
Apr 24 23:57:06.918352 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.918323 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-oauth-serving-cert\") pod \"0cb907ed-fab4-488e-b464-1a797b12a706\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") "
Apr 24 23:57:06.918595 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.918356 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cb907ed-fab4-488e-b464-1a797b12a706-console-oauth-config\") pod \"0cb907ed-fab4-488e-b464-1a797b12a706\" (UID: \"0cb907ed-fab4-488e-b464-1a797b12a706\") "
Apr 24 23:57:06.918768 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.918742 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-service-ca" (OuterVolumeSpecName: "service-ca") pod "0cb907ed-fab4-488e-b464-1a797b12a706" (UID: "0cb907ed-fab4-488e-b464-1a797b12a706"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:06.918768 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.918751 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0cb907ed-fab4-488e-b464-1a797b12a706" (UID: "0cb907ed-fab4-488e-b464-1a797b12a706"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:06.918866 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.918767 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-console-config" (OuterVolumeSpecName: "console-config") pod "0cb907ed-fab4-488e-b464-1a797b12a706" (UID: "0cb907ed-fab4-488e-b464-1a797b12a706"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:06.918866 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.918777 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0cb907ed-fab4-488e-b464-1a797b12a706" (UID: "0cb907ed-fab4-488e-b464-1a797b12a706"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:57:06.920436 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.920416 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb907ed-fab4-488e-b464-1a797b12a706-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0cb907ed-fab4-488e-b464-1a797b12a706" (UID: "0cb907ed-fab4-488e-b464-1a797b12a706"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:06.920512 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.920471 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb907ed-fab4-488e-b464-1a797b12a706-kube-api-access-zbsnk" (OuterVolumeSpecName: "kube-api-access-zbsnk") pod "0cb907ed-fab4-488e-b464-1a797b12a706" (UID: "0cb907ed-fab4-488e-b464-1a797b12a706"). InnerVolumeSpecName "kube-api-access-zbsnk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:57:06.920559 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:06.920545 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb907ed-fab4-488e-b464-1a797b12a706-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0cb907ed-fab4-488e-b464-1a797b12a706" (UID: "0cb907ed-fab4-488e-b464-1a797b12a706"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:57:07.019112 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.019067 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-service-ca\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:07.019112 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.019100 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-oauth-serving-cert\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:07.019112 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.019115 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cb907ed-fab4-488e-b464-1a797b12a706-console-oauth-config\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:07.019366 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.019128 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zbsnk\" (UniqueName: \"kubernetes.io/projected/0cb907ed-fab4-488e-b464-1a797b12a706-kube-api-access-zbsnk\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:07.019366 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.019141 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-console-config\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:07.019366 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.019152 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb907ed-fab4-488e-b464-1a797b12a706-console-serving-cert\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:07.019366 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.019163 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb907ed-fab4-488e-b464-1a797b12a706-trusted-ca-bundle\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 24 23:57:07.036527 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.036502 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9c7bb9bf7-l6tfg_0cb907ed-fab4-488e-b464-1a797b12a706/console/0.log"
Apr 24 23:57:07.036662 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.036540 2566 generic.go:358] "Generic (PLEG): container finished" podID="0cb907ed-fab4-488e-b464-1a797b12a706" containerID="1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db" exitCode=2
Apr 24 23:57:07.036662 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.036568 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c7bb9bf7-l6tfg" event={"ID":"0cb907ed-fab4-488e-b464-1a797b12a706","Type":"ContainerDied","Data":"1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db"}
Apr 24 23:57:07.036662 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.036604 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c7bb9bf7-l6tfg" event={"ID":"0cb907ed-fab4-488e-b464-1a797b12a706","Type":"ContainerDied","Data":"2ff7c4c0c00f1fa4a7bd86b930b31457c8040075e90cfee7012fc30f6db40a58"}
Apr 24 23:57:07.036662 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.036615 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9c7bb9bf7-l6tfg"
Apr 24 23:57:07.036785 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.036618 2566 scope.go:117] "RemoveContainer" containerID="1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db"
Apr 24 23:57:07.044400 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.044378 2566 scope.go:117] "RemoveContainer" containerID="1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db"
Apr 24 23:57:07.044687 ip-10-0-139-62 kubenswrapper[2566]: E0424 23:57:07.044665 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db\": container with ID starting with 1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db not found: ID does not exist" containerID="1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db"
Apr 24 23:57:07.044775 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.044696 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db"} err="failed to get container status \"1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db\": rpc error: code = NotFound desc = could not find container \"1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db\": container with ID starting with 1b34704032b93e9a07b4103f73c098c80bf8da782daea319598316fb57cfd1db not found: ID does not exist"
Apr 24 23:57:07.058849 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.058829 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9c7bb9bf7-l6tfg"]
Apr 24 23:57:07.062295 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:07.062276 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9c7bb9bf7-l6tfg"]
Apr 24 23:57:08.403408 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:08.403378 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb907ed-fab4-488e-b464-1a797b12a706" path="/var/lib/kubelet/pods/0cb907ed-fab4-488e-b464-1a797b12a706/volumes"
Apr 24 23:57:19.591824 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:19.591783 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-566d956db8-6q454"
Apr 24 23:57:19.595665 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:19.595639 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-566d956db8-6q454"
Apr 24 23:57:32.212549 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:32.212436 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:57:32.214663 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:32.214641 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9f062da-f1a8-4e5a-ac2f-ad672791353b-metrics-certs\") pod \"network-metrics-daemon-c6pqs\" (UID: \"f9f062da-f1a8-4e5a-ac2f-ad672791353b\") " pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:57:32.504247 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:32.504140 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jk9j5\""
Apr 24 23:57:32.512417 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:32.512397 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c6pqs"
Apr 24 23:57:32.628261 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:32.628229 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c6pqs"]
Apr 24 23:57:32.631626 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:57:32.631594 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f062da_f1a8_4e5a_ac2f_ad672791353b.slice/crio-f53b61baa68786506c5d9431f2ef990c2b9cf0f1d8a4b9b7667a618fd1be0162 WatchSource:0}: Error finding container f53b61baa68786506c5d9431f2ef990c2b9cf0f1d8a4b9b7667a618fd1be0162: Status 404 returned error can't find the container with id f53b61baa68786506c5d9431f2ef990c2b9cf0f1d8a4b9b7667a618fd1be0162
Apr 24 23:57:33.102919 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:33.102880 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c6pqs" event={"ID":"f9f062da-f1a8-4e5a-ac2f-ad672791353b","Type":"ContainerStarted","Data":"f53b61baa68786506c5d9431f2ef990c2b9cf0f1d8a4b9b7667a618fd1be0162"}
Apr 24 23:57:34.106786 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:34.106747 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c6pqs" event={"ID":"f9f062da-f1a8-4e5a-ac2f-ad672791353b","Type":"ContainerStarted","Data":"0ec6f1cbe4c6105eb2e06915d5573ce3be74d5ac46861142c86b09a367db936d"}
Apr 24 23:57:34.106786 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:34.106790 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c6pqs" event={"ID":"f9f062da-f1a8-4e5a-ac2f-ad672791353b","Type":"ContainerStarted","Data":"afe8f65a6e00902becb8f8ddc727be6bb62005cf6150caf0d7bbb0fb208f55c1"}
Apr 24 23:57:34.124954 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:57:34.124904 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-c6pqs" podStartSLOduration=253.211747842 podStartE2EDuration="4m14.124889656s" podCreationTimestamp="2026-04-24 23:53:20 +0000 UTC" firstStartedPulling="2026-04-24 23:57:32.633355107 +0000 UTC m=+252.762563364" lastFinishedPulling="2026-04-24 23:57:33.546496917 +0000 UTC m=+253.675705178" observedRunningTime="2026-04-24 23:57:34.124066048 +0000 UTC m=+254.253274337" watchObservedRunningTime="2026-04-24 23:57:34.124889656 +0000 UTC m=+254.254097933"
Apr 24 23:58:20.348932 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:58:20.348904 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log"
Apr 24 23:58:20.349431 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:58:20.348903 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log"
Apr 24 23:58:20.354434 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:58:20.354396 2566 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 23:59:10.329659 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.329567 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8456455dd-qqx7s"]
Apr 24 23:59:10.330082 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.329890 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cb907ed-fab4-488e-b464-1a797b12a706" containerName="console"
Apr 24 23:59:10.330082 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.329906 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb907ed-fab4-488e-b464-1a797b12a706" containerName="console"
Apr 24 23:59:10.330082 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.329922 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a185de46-a1c8-4c54-b1dc-d4928ab48ce4" containerName="registry"
Apr 24 23:59:10.330082 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.329927 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a185de46-a1c8-4c54-b1dc-d4928ab48ce4" containerName="registry"
Apr 24 23:59:10.330082 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.329975 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="a185de46-a1c8-4c54-b1dc-d4928ab48ce4" containerName="registry"
Apr 24 23:59:10.330082 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.329984 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cb907ed-fab4-488e-b464-1a797b12a706" containerName="console"
Apr 24 23:59:10.332698 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.332680 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.335288 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.335268 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 23:59:10.336252 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.336226 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 23:59:10.336355 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.336292 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 23:59:10.336461 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.336445 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 23:59:10.336527 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.336485 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 23:59:10.336527 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.336489 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-2sr25\""
Apr 24 23:59:10.340594 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.340573 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 24 23:59:10.344166 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.344146 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8456455dd-qqx7s"]
Apr 24 23:59:10.418760 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.418732 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-trusted-ca-bundle\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.418932 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.418769 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cm64\" (UniqueName: \"kubernetes.io/projected/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-kube-api-access-2cm64\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.418932 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.418795 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-config\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.418932 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.418850 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-oauth-config\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.418932 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.418879 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-oauth-serving-cert\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.418932 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.418917 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-serving-cert\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.419106 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.418963 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-service-ca\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.519857 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.519825 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-service-ca\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.519857 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.519864 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-trusted-ca-bundle\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.520063 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.519963 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cm64\" (UniqueName: \"kubernetes.io/projected/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-kube-api-access-2cm64\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.520063 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.519987 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-config\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.520063 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.520010 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-oauth-config\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.520063 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.520026 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-oauth-serving-cert\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.520063 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.520047 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-serving-cert\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.520646 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.520603 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-service-ca\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.520762 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.520663 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-config\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.520910 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.520888 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-oauth-serving-cert\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.520948 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.520923 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-trusted-ca-bundle\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.522440 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.522413 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-oauth-config\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.522522 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.522494 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-serving-cert\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.528422 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.528402 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cm64\" (UniqueName: \"kubernetes.io/projected/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-kube-api-access-2cm64\") pod \"console-8456455dd-qqx7s\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " pod="openshift-console/console-8456455dd-qqx7s"
Apr 24 23:59:10.642234 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.642147 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-8456455dd-qqx7s" Apr 24 23:59:10.752220 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.752184 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8456455dd-qqx7s"] Apr 24 23:59:10.754681 ip-10-0-139-62 kubenswrapper[2566]: W0424 23:59:10.754644 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb107eb5_5562_4916_9b2a_5fbc8c067a4c.slice/crio-4bdfb455116474b994ca15210ee1f7f2feb03f8616aef13986145fb8981d28f4 WatchSource:0}: Error finding container 4bdfb455116474b994ca15210ee1f7f2feb03f8616aef13986145fb8981d28f4: Status 404 returned error can't find the container with id 4bdfb455116474b994ca15210ee1f7f2feb03f8616aef13986145fb8981d28f4 Apr 24 23:59:10.756511 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:10.756494 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 23:59:11.374158 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:11.374120 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8456455dd-qqx7s" event={"ID":"bb107eb5-5562-4916-9b2a-5fbc8c067a4c","Type":"ContainerStarted","Data":"c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96"} Apr 24 23:59:11.374158 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:11.374155 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8456455dd-qqx7s" event={"ID":"bb107eb5-5562-4916-9b2a-5fbc8c067a4c","Type":"ContainerStarted","Data":"4bdfb455116474b994ca15210ee1f7f2feb03f8616aef13986145fb8981d28f4"} Apr 24 23:59:11.394481 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:11.394437 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8456455dd-qqx7s" podStartSLOduration=1.394422214 podStartE2EDuration="1.394422214s" podCreationTimestamp="2026-04-24 23:59:10 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:59:11.392650155 +0000 UTC m=+351.521858428" watchObservedRunningTime="2026-04-24 23:59:11.394422214 +0000 UTC m=+351.523630493" Apr 24 23:59:20.642483 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:20.642432 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8456455dd-qqx7s" Apr 24 23:59:20.642483 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:20.642477 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8456455dd-qqx7s" Apr 24 23:59:20.647405 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:20.647380 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8456455dd-qqx7s" Apr 24 23:59:21.407428 ip-10-0-139-62 kubenswrapper[2566]: I0424 23:59:21.407393 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8456455dd-qqx7s" Apr 25 00:00:00.157630 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.154708 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29617920-dnclw"] Apr 25 00:00:00.158952 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.158926 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-dnclw" Apr 25 00:00:00.161706 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.161688 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"pruner-dockercfg-vbkd9\"" Apr 25 00:00:00.161706 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.161694 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"serviceca\"" Apr 25 00:00:00.166101 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.166080 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29617920-dnclw"] Apr 25 00:00:00.287569 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.287540 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znbzq\" (UniqueName: \"kubernetes.io/projected/8570f506-620d-4a4f-8a35-f62c0a517524-kube-api-access-znbzq\") pod \"image-pruner-29617920-dnclw\" (UID: \"8570f506-620d-4a4f-8a35-f62c0a517524\") " pod="openshift-image-registry/image-pruner-29617920-dnclw" Apr 25 00:00:00.287682 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.287590 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8570f506-620d-4a4f-8a35-f62c0a517524-serviceca\") pod \"image-pruner-29617920-dnclw\" (UID: \"8570f506-620d-4a4f-8a35-f62c0a517524\") " pod="openshift-image-registry/image-pruner-29617920-dnclw" Apr 25 00:00:00.388978 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.388933 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8570f506-620d-4a4f-8a35-f62c0a517524-serviceca\") pod \"image-pruner-29617920-dnclw\" (UID: \"8570f506-620d-4a4f-8a35-f62c0a517524\") " pod="openshift-image-registry/image-pruner-29617920-dnclw" Apr 
25 00:00:00.389167 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.389045 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znbzq\" (UniqueName: \"kubernetes.io/projected/8570f506-620d-4a4f-8a35-f62c0a517524-kube-api-access-znbzq\") pod \"image-pruner-29617920-dnclw\" (UID: \"8570f506-620d-4a4f-8a35-f62c0a517524\") " pod="openshift-image-registry/image-pruner-29617920-dnclw" Apr 25 00:00:00.389652 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.389628 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8570f506-620d-4a4f-8a35-f62c0a517524-serviceca\") pod \"image-pruner-29617920-dnclw\" (UID: \"8570f506-620d-4a4f-8a35-f62c0a517524\") " pod="openshift-image-registry/image-pruner-29617920-dnclw" Apr 25 00:00:00.397983 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.397953 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znbzq\" (UniqueName: \"kubernetes.io/projected/8570f506-620d-4a4f-8a35-f62c0a517524-kube-api-access-znbzq\") pod \"image-pruner-29617920-dnclw\" (UID: \"8570f506-620d-4a4f-8a35-f62c0a517524\") " pod="openshift-image-registry/image-pruner-29617920-dnclw" Apr 25 00:00:00.489487 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.489453 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-dnclw" Apr 25 00:00:00.605325 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:00.605275 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29617920-dnclw"] Apr 25 00:00:00.608805 ip-10-0-139-62 kubenswrapper[2566]: W0425 00:00:00.608774 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8570f506_620d_4a4f_8a35_f62c0a517524.slice/crio-36bd331f82fe969b61e2b4db83ce727b3c3cbd4cf97e0fa90cc67d8cf17ea3fa WatchSource:0}: Error finding container 36bd331f82fe969b61e2b4db83ce727b3c3cbd4cf97e0fa90cc67d8cf17ea3fa: Status 404 returned error can't find the container with id 36bd331f82fe969b61e2b4db83ce727b3c3cbd4cf97e0fa90cc67d8cf17ea3fa Apr 25 00:00:01.501974 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:01.501937 2566 generic.go:358] "Generic (PLEG): container finished" podID="8570f506-620d-4a4f-8a35-f62c0a517524" containerID="cbda50079fffaec8b499b8f349c38ad3817750d30a75bcce9b13ae33089062a6" exitCode=0 Apr 25 00:00:01.502367 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:01.501991 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29617920-dnclw" event={"ID":"8570f506-620d-4a4f-8a35-f62c0a517524","Type":"ContainerDied","Data":"cbda50079fffaec8b499b8f349c38ad3817750d30a75bcce9b13ae33089062a6"} Apr 25 00:00:01.502367 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:01.502016 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29617920-dnclw" event={"ID":"8570f506-620d-4a4f-8a35-f62c0a517524","Type":"ContainerStarted","Data":"36bd331f82fe969b61e2b4db83ce727b3c3cbd4cf97e0fa90cc67d8cf17ea3fa"} Apr 25 00:00:02.623482 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:02.623456 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-dnclw" Apr 25 00:00:02.706618 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:02.706577 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znbzq\" (UniqueName: \"kubernetes.io/projected/8570f506-620d-4a4f-8a35-f62c0a517524-kube-api-access-znbzq\") pod \"8570f506-620d-4a4f-8a35-f62c0a517524\" (UID: \"8570f506-620d-4a4f-8a35-f62c0a517524\") " Apr 25 00:00:02.706812 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:02.706635 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8570f506-620d-4a4f-8a35-f62c0a517524-serviceca\") pod \"8570f506-620d-4a4f-8a35-f62c0a517524\" (UID: \"8570f506-620d-4a4f-8a35-f62c0a517524\") " Apr 25 00:00:02.707012 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:02.706985 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8570f506-620d-4a4f-8a35-f62c0a517524-serviceca" (OuterVolumeSpecName: "serviceca") pod "8570f506-620d-4a4f-8a35-f62c0a517524" (UID: "8570f506-620d-4a4f-8a35-f62c0a517524"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:00:02.708778 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:02.708744 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8570f506-620d-4a4f-8a35-f62c0a517524-kube-api-access-znbzq" (OuterVolumeSpecName: "kube-api-access-znbzq") pod "8570f506-620d-4a4f-8a35-f62c0a517524" (UID: "8570f506-620d-4a4f-8a35-f62c0a517524"). InnerVolumeSpecName "kube-api-access-znbzq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:00:02.807690 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:02.807655 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-znbzq\" (UniqueName: \"kubernetes.io/projected/8570f506-620d-4a4f-8a35-f62c0a517524-kube-api-access-znbzq\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:00:02.807690 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:02.807686 2566 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8570f506-620d-4a4f-8a35-f62c0a517524-serviceca\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:00:03.508508 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:03.508475 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29617920-dnclw" event={"ID":"8570f506-620d-4a4f-8a35-f62c0a517524","Type":"ContainerDied","Data":"36bd331f82fe969b61e2b4db83ce727b3c3cbd4cf97e0fa90cc67d8cf17ea3fa"} Apr 25 00:00:03.508508 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:03.508510 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36bd331f82fe969b61e2b4db83ce727b3c3cbd4cf97e0fa90cc67d8cf17ea3fa" Apr 25 00:00:03.508508 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:00:03.508487 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29617920-dnclw" Apr 25 00:01:16.434489 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.434451 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2"] Apr 25 00:01:16.435007 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.434854 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8570f506-620d-4a4f-8a35-f62c0a517524" containerName="image-pruner" Apr 25 00:01:16.435007 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.434874 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8570f506-620d-4a4f-8a35-f62c0a517524" containerName="image-pruner" Apr 25 00:01:16.435007 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.434923 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="8570f506-620d-4a4f-8a35-f62c0a517524" containerName="image-pruner" Apr 25 00:01:16.437600 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.437580 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" Apr 25 00:01:16.440085 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.440063 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-c42hg\"" Apr 25 00:01:16.440309 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.440293 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 25 00:01:16.440723 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.440710 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 25 00:01:16.440896 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.440883 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 25 00:01:16.461068 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.461040 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2"] Apr 25 00:01:16.556596 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.556554 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/64ebca1d-c0da-4fcc-86bd-c1fa7401e457-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2\" (UID: \"64ebca1d-c0da-4fcc-86bd-c1fa7401e457\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" Apr 25 00:01:16.556596 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.556593 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cndvm\" (UniqueName: \"kubernetes.io/projected/64ebca1d-c0da-4fcc-86bd-c1fa7401e457-kube-api-access-cndvm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2\" (UID: 
\"64ebca1d-c0da-4fcc-86bd-c1fa7401e457\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" Apr 25 00:01:16.657155 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.657126 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/64ebca1d-c0da-4fcc-86bd-c1fa7401e457-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2\" (UID: \"64ebca1d-c0da-4fcc-86bd-c1fa7401e457\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" Apr 25 00:01:16.657368 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.657165 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cndvm\" (UniqueName: \"kubernetes.io/projected/64ebca1d-c0da-4fcc-86bd-c1fa7401e457-kube-api-access-cndvm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2\" (UID: \"64ebca1d-c0da-4fcc-86bd-c1fa7401e457\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" Apr 25 00:01:16.659420 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.659401 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/64ebca1d-c0da-4fcc-86bd-c1fa7401e457-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2\" (UID: \"64ebca1d-c0da-4fcc-86bd-c1fa7401e457\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" Apr 25 00:01:16.673879 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.673856 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cndvm\" (UniqueName: \"kubernetes.io/projected/64ebca1d-c0da-4fcc-86bd-c1fa7401e457-kube-api-access-cndvm\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2\" (UID: \"64ebca1d-c0da-4fcc-86bd-c1fa7401e457\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" Apr 25 00:01:16.747551 ip-10-0-139-62 kubenswrapper[2566]: 
I0425 00:01:16.747471 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" Apr 25 00:01:16.872714 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:16.867359 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2"] Apr 25 00:01:17.707978 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:17.707938 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" event={"ID":"64ebca1d-c0da-4fcc-86bd-c1fa7401e457","Type":"ContainerStarted","Data":"9bf3f093d1b6fd81df3cdc64c27d63fe439bf09ea0967f3ce04c8fdd8f23bf3e"} Apr 25 00:01:20.718027 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:20.717992 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" event={"ID":"64ebca1d-c0da-4fcc-86bd-c1fa7401e457","Type":"ContainerStarted","Data":"973f73e76128ddc024d94e25328e88833aafe923421414ff8e7b91509abc6eee"} Apr 25 00:01:20.718449 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:20.718052 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" Apr 25 00:01:20.735539 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:20.735473 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" podStartSLOduration=1.077101813 podStartE2EDuration="4.735459473s" podCreationTimestamp="2026-04-25 00:01:16 +0000 UTC" firstStartedPulling="2026-04-25 00:01:16.873099427 +0000 UTC m=+477.002307688" lastFinishedPulling="2026-04-25 00:01:20.531457087 +0000 UTC m=+480.660665348" observedRunningTime="2026-04-25 00:01:20.73417815 +0000 UTC m=+480.863386454" watchObservedRunningTime="2026-04-25 00:01:20.735459473 +0000 UTC m=+480.864667753" Apr 
25 00:01:21.401388 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.401195 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9"] Apr 25 00:01:21.404583 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.404555 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:21.407557 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.407529 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 25 00:01:21.407678 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.407577 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-p5vmd\"" Apr 25 00:01:21.407827 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.407812 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 25 00:01:21.417177 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.417148 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9"] Apr 25 00:01:21.501566 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.501522 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4x8x9\" (UID: \"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:21.501751 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.501597 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5gc8\" (UniqueName: \"kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-kube-api-access-d5gc8\") pod 
\"keda-metrics-apiserver-7c9f485588-4x8x9\" (UID: \"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:21.501751 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.501659 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-4x8x9\" (UID: \"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:21.602183 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.602143 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4x8x9\" (UID: \"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:21.602403 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.602242 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5gc8\" (UniqueName: \"kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-kube-api-access-d5gc8\") pod \"keda-metrics-apiserver-7c9f485588-4x8x9\" (UID: \"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:21.602403 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.602270 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-4x8x9\" (UID: \"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:21.602403 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:01:21.602310 2566 secret.go:281] 
references non-existent secret key: tls.crt Apr 25 00:01:21.602403 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:01:21.602332 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 25 00:01:21.602403 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:01:21.602352 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9: references non-existent secret key: tls.crt Apr 25 00:01:21.602403 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:01:21.602407 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-certificates podName:61bcdd00-b7c0-403f-9fc5-325ac01ba1f2 nodeName:}" failed. No retries permitted until 2026-04-25 00:01:22.102392226 +0000 UTC m=+482.231600484 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-certificates") pod "keda-metrics-apiserver-7c9f485588-4x8x9" (UID: "61bcdd00-b7c0-403f-9fc5-325ac01ba1f2") : references non-existent secret key: tls.crt Apr 25 00:01:21.602625 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.602595 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-4x8x9\" (UID: \"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:21.611128 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:21.611101 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5gc8\" (UniqueName: \"kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-kube-api-access-d5gc8\") pod \"keda-metrics-apiserver-7c9f485588-4x8x9\" (UID: \"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:22.105943 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:22.105893 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4x8x9\" (UID: \"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:22.106417 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:01:22.106051 2566 secret.go:281] references non-existent secret key: tls.crt Apr 25 00:01:22.106417 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:01:22.106067 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 25 00:01:22.106417 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:01:22.106090 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9: references non-existent secret key: tls.crt Apr 25 00:01:22.106417 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:01:22.106145 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-certificates podName:61bcdd00-b7c0-403f-9fc5-325ac01ba1f2 nodeName:}" failed. No retries permitted until 2026-04-25 00:01:23.106130878 +0000 UTC m=+483.235339136 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-certificates") pod "keda-metrics-apiserver-7c9f485588-4x8x9" (UID: "61bcdd00-b7c0-403f-9fc5-325ac01ba1f2") : references non-existent secret key: tls.crt Apr 25 00:01:23.113565 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:23.113525 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4x8x9\" (UID: \"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:23.113986 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:01:23.113647 2566 secret.go:281] references non-existent secret key: tls.crt Apr 25 00:01:23.113986 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:01:23.113660 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 25 00:01:23.113986 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:01:23.113677 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9: references non-existent secret key: tls.crt Apr 25 00:01:23.113986 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:01:23.113729 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-certificates podName:61bcdd00-b7c0-403f-9fc5-325ac01ba1f2 nodeName:}" failed. No retries permitted until 2026-04-25 00:01:25.113715677 +0000 UTC m=+485.242923935 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-certificates") pod "keda-metrics-apiserver-7c9f485588-4x8x9" (UID: "61bcdd00-b7c0-403f-9fc5-325ac01ba1f2") : references non-existent secret key: tls.crt Apr 25 00:01:25.135052 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:25.135018 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4x8x9\" (UID: \"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:25.137737 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:25.137714 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61bcdd00-b7c0-403f-9fc5-325ac01ba1f2-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4x8x9\" (UID: \"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:25.314833 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:25.314792 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:25.430185 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:25.430161 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9"] Apr 25 00:01:25.432888 ip-10-0-139-62 kubenswrapper[2566]: W0425 00:01:25.432862 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61bcdd00_b7c0_403f_9fc5_325ac01ba1f2.slice/crio-18d558da19bc7f5aa2f721031e65406a990ecb97d5d21f4ac3640212e0cad48f WatchSource:0}: Error finding container 18d558da19bc7f5aa2f721031e65406a990ecb97d5d21f4ac3640212e0cad48f: Status 404 returned error can't find the container with id 18d558da19bc7f5aa2f721031e65406a990ecb97d5d21f4ac3640212e0cad48f Apr 25 00:01:25.732550 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:25.732458 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" event={"ID":"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2","Type":"ContainerStarted","Data":"18d558da19bc7f5aa2f721031e65406a990ecb97d5d21f4ac3640212e0cad48f"} Apr 25 00:01:28.743581 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:28.743541 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" event={"ID":"61bcdd00-b7c0-403f-9fc5-325ac01ba1f2","Type":"ContainerStarted","Data":"ae6d70db29081854b2c6aec51c49929916b5589e759f42cd84612d295d643d49"} Apr 25 00:01:28.744025 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:28.743672 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:28.762567 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:28.762511 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" 
podStartSLOduration=4.983834092 podStartE2EDuration="7.76249548s" podCreationTimestamp="2026-04-25 00:01:21 +0000 UTC" firstStartedPulling="2026-04-25 00:01:25.434169253 +0000 UTC m=+485.563377512" lastFinishedPulling="2026-04-25 00:01:28.212830639 +0000 UTC m=+488.342038900" observedRunningTime="2026-04-25 00:01:28.761684283 +0000 UTC m=+488.890892564" watchObservedRunningTime="2026-04-25 00:01:28.76249548 +0000 UTC m=+488.891703768" Apr 25 00:01:39.750758 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:39.750728 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4x8x9" Apr 25 00:01:41.723592 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:01:41.723564 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2f6m2" Apr 25 00:02:27.532626 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.532540 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-p5frp"] Apr 25 00:02:27.535062 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.535042 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-njnrk"] Apr 25 00:02:27.535197 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.535180 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" Apr 25 00:02:27.537240 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.537219 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" Apr 25 00:02:27.537703 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.537679 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 25 00:02:27.537820 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.537710 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 25 00:02:27.537820 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.537765 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-v59k2\"" Apr 25 00:02:27.537940 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.537833 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 25 00:02:27.539519 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.539500 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 25 00:02:27.539681 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.539666 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-hwrv2\"" Apr 25 00:02:27.545424 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.545375 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-p5frp"] Apr 25 00:02:27.561936 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.561907 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-njnrk"] Apr 25 00:02:27.637802 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.637771 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjw5k\" (UniqueName: 
\"kubernetes.io/projected/51d00fe8-d244-4e1b-b5cc-ad0affd91ea5-kube-api-access-cjw5k\") pod \"llmisvc-controller-manager-68cc5db7c4-njnrk\" (UID: \"51d00fe8-d244-4e1b-b5cc-ad0affd91ea5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" Apr 25 00:02:27.638001 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.637831 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c4e51de-7450-4be4-832f-f9a1fc78dc20-cert\") pod \"kserve-controller-manager-64c4d9588d-p5frp\" (UID: \"6c4e51de-7450-4be4-832f-f9a1fc78dc20\") " pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" Apr 25 00:02:27.638001 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.637853 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51d00fe8-d244-4e1b-b5cc-ad0affd91ea5-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-njnrk\" (UID: \"51d00fe8-d244-4e1b-b5cc-ad0affd91ea5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" Apr 25 00:02:27.638001 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.637943 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbkw4\" (UniqueName: \"kubernetes.io/projected/6c4e51de-7450-4be4-832f-f9a1fc78dc20-kube-api-access-kbkw4\") pod \"kserve-controller-manager-64c4d9588d-p5frp\" (UID: \"6c4e51de-7450-4be4-832f-f9a1fc78dc20\") " pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" Apr 25 00:02:27.739035 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.739000 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51d00fe8-d244-4e1b-b5cc-ad0affd91ea5-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-njnrk\" (UID: \"51d00fe8-d244-4e1b-b5cc-ad0affd91ea5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" Apr 25 
00:02:27.739252 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.739068 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbkw4\" (UniqueName: \"kubernetes.io/projected/6c4e51de-7450-4be4-832f-f9a1fc78dc20-kube-api-access-kbkw4\") pod \"kserve-controller-manager-64c4d9588d-p5frp\" (UID: \"6c4e51de-7450-4be4-832f-f9a1fc78dc20\") " pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" Apr 25 00:02:27.739252 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.739097 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjw5k\" (UniqueName: \"kubernetes.io/projected/51d00fe8-d244-4e1b-b5cc-ad0affd91ea5-kube-api-access-cjw5k\") pod \"llmisvc-controller-manager-68cc5db7c4-njnrk\" (UID: \"51d00fe8-d244-4e1b-b5cc-ad0affd91ea5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" Apr 25 00:02:27.739252 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.739165 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c4e51de-7450-4be4-832f-f9a1fc78dc20-cert\") pod \"kserve-controller-manager-64c4d9588d-p5frp\" (UID: \"6c4e51de-7450-4be4-832f-f9a1fc78dc20\") " pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" Apr 25 00:02:27.741544 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.741514 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51d00fe8-d244-4e1b-b5cc-ad0affd91ea5-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-njnrk\" (UID: \"51d00fe8-d244-4e1b-b5cc-ad0affd91ea5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" Apr 25 00:02:27.741683 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.741545 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c4e51de-7450-4be4-832f-f9a1fc78dc20-cert\") pod \"kserve-controller-manager-64c4d9588d-p5frp\" (UID: 
\"6c4e51de-7450-4be4-832f-f9a1fc78dc20\") " pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" Apr 25 00:02:27.747727 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.747698 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbkw4\" (UniqueName: \"kubernetes.io/projected/6c4e51de-7450-4be4-832f-f9a1fc78dc20-kube-api-access-kbkw4\") pod \"kserve-controller-manager-64c4d9588d-p5frp\" (UID: \"6c4e51de-7450-4be4-832f-f9a1fc78dc20\") " pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" Apr 25 00:02:27.747849 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.747792 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjw5k\" (UniqueName: \"kubernetes.io/projected/51d00fe8-d244-4e1b-b5cc-ad0affd91ea5-kube-api-access-cjw5k\") pod \"llmisvc-controller-manager-68cc5db7c4-njnrk\" (UID: \"51d00fe8-d244-4e1b-b5cc-ad0affd91ea5\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" Apr 25 00:02:27.849277 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.849150 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" Apr 25 00:02:27.854989 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.854961 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" Apr 25 00:02:27.975388 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.975356 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-p5frp"] Apr 25 00:02:27.977419 ip-10-0-139-62 kubenswrapper[2566]: W0425 00:02:27.977389 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c4e51de_7450_4be4_832f_f9a1fc78dc20.slice/crio-680e39e0af5ea3672ca75fd8f686f10dc91afd2c2d4e647e47093d0b3009d2a6 WatchSource:0}: Error finding container 680e39e0af5ea3672ca75fd8f686f10dc91afd2c2d4e647e47093d0b3009d2a6: Status 404 returned error can't find the container with id 680e39e0af5ea3672ca75fd8f686f10dc91afd2c2d4e647e47093d0b3009d2a6 Apr 25 00:02:27.994432 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:27.994409 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-njnrk"] Apr 25 00:02:27.996632 ip-10-0-139-62 kubenswrapper[2566]: W0425 00:02:27.996604 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod51d00fe8_d244_4e1b_b5cc_ad0affd91ea5.slice/crio-3ffb1e5148a8edb49b9c32b8ad110ba80ed3c2c94a8c622bce0e494736f2b227 WatchSource:0}: Error finding container 3ffb1e5148a8edb49b9c32b8ad110ba80ed3c2c94a8c622bce0e494736f2b227: Status 404 returned error can't find the container with id 3ffb1e5148a8edb49b9c32b8ad110ba80ed3c2c94a8c622bce0e494736f2b227 Apr 25 00:02:28.911771 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:28.911730 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" event={"ID":"6c4e51de-7450-4be4-832f-f9a1fc78dc20","Type":"ContainerStarted","Data":"680e39e0af5ea3672ca75fd8f686f10dc91afd2c2d4e647e47093d0b3009d2a6"} Apr 25 00:02:28.913552 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:28.913324 2566 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" event={"ID":"51d00fe8-d244-4e1b-b5cc-ad0affd91ea5","Type":"ContainerStarted","Data":"3ffb1e5148a8edb49b9c32b8ad110ba80ed3c2c94a8c622bce0e494736f2b227"} Apr 25 00:02:31.925084 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:31.925037 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" event={"ID":"51d00fe8-d244-4e1b-b5cc-ad0affd91ea5","Type":"ContainerStarted","Data":"628cff9805deaac7e9b28f445522fbff85007d779bbf9f3a89ff9d4ccfc49c5c"} Apr 25 00:02:31.925570 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:31.925141 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" Apr 25 00:02:31.926350 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:31.926323 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" event={"ID":"6c4e51de-7450-4be4-832f-f9a1fc78dc20","Type":"ContainerStarted","Data":"0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6"} Apr 25 00:02:31.926487 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:31.926473 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" Apr 25 00:02:31.944792 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:31.944749 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" podStartSLOduration=1.982810384 podStartE2EDuration="4.944733642s" podCreationTimestamp="2026-04-25 00:02:27 +0000 UTC" firstStartedPulling="2026-04-25 00:02:27.998130199 +0000 UTC m=+548.127338460" lastFinishedPulling="2026-04-25 00:02:30.960053446 +0000 UTC m=+551.089261718" observedRunningTime="2026-04-25 00:02:31.943299992 +0000 UTC m=+552.072508284" watchObservedRunningTime="2026-04-25 00:02:31.944733642 +0000 UTC 
m=+552.073941921" Apr 25 00:02:31.958894 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:02:31.958841 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" podStartSLOduration=1.976079613 podStartE2EDuration="4.958825719s" podCreationTimestamp="2026-04-25 00:02:27 +0000 UTC" firstStartedPulling="2026-04-25 00:02:27.978703619 +0000 UTC m=+548.107911876" lastFinishedPulling="2026-04-25 00:02:30.961449724 +0000 UTC m=+551.090657982" observedRunningTime="2026-04-25 00:02:31.958662722 +0000 UTC m=+552.087871003" watchObservedRunningTime="2026-04-25 00:02:31.958825719 +0000 UTC m=+552.088034000" Apr 25 00:03:02.931744 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:02.931709 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-njnrk" Apr 25 00:03:02.934733 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:02.934708 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" Apr 25 00:03:04.396374 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.396340 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-p5frp"] Apr 25 00:03:04.396781 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.396575 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" podUID="6c4e51de-7450-4be4-832f-f9a1fc78dc20" containerName="manager" containerID="cri-o://0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6" gracePeriod=10 Apr 25 00:03:04.419778 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.419746 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-ddv76"] Apr 25 00:03:04.470094 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.470067 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve/kserve-controller-manager-64c4d9588d-ddv76"] Apr 25 00:03:04.470201 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.470192 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" Apr 25 00:03:04.542581 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.542538 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtdwd\" (UniqueName: \"kubernetes.io/projected/687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b-kube-api-access-vtdwd\") pod \"kserve-controller-manager-64c4d9588d-ddv76\" (UID: \"687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b\") " pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" Apr 25 00:03:04.542801 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.542774 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b-cert\") pod \"kserve-controller-manager-64c4d9588d-ddv76\" (UID: \"687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b\") " pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" Apr 25 00:03:04.644402 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.644332 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtdwd\" (UniqueName: \"kubernetes.io/projected/687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b-kube-api-access-vtdwd\") pod \"kserve-controller-manager-64c4d9588d-ddv76\" (UID: \"687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b\") " pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" Apr 25 00:03:04.644502 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.644457 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b-cert\") pod \"kserve-controller-manager-64c4d9588d-ddv76\" (UID: \"687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b\") " 
pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" Apr 25 00:03:04.646724 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.646673 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b-cert\") pod \"kserve-controller-manager-64c4d9588d-ddv76\" (UID: \"687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b\") " pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" Apr 25 00:03:04.652957 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.652927 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtdwd\" (UniqueName: \"kubernetes.io/projected/687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b-kube-api-access-vtdwd\") pod \"kserve-controller-manager-64c4d9588d-ddv76\" (UID: \"687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b\") " pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" Apr 25 00:03:04.663178 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.663160 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" Apr 25 00:03:04.745390 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.745354 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c4e51de-7450-4be4-832f-f9a1fc78dc20-cert\") pod \"6c4e51de-7450-4be4-832f-f9a1fc78dc20\" (UID: \"6c4e51de-7450-4be4-832f-f9a1fc78dc20\") " Apr 25 00:03:04.745545 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.745440 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbkw4\" (UniqueName: \"kubernetes.io/projected/6c4e51de-7450-4be4-832f-f9a1fc78dc20-kube-api-access-kbkw4\") pod \"6c4e51de-7450-4be4-832f-f9a1fc78dc20\" (UID: \"6c4e51de-7450-4be4-832f-f9a1fc78dc20\") " Apr 25 00:03:04.747460 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.747431 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c4e51de-7450-4be4-832f-f9a1fc78dc20-cert" (OuterVolumeSpecName: "cert") pod "6c4e51de-7450-4be4-832f-f9a1fc78dc20" (UID: "6c4e51de-7450-4be4-832f-f9a1fc78dc20"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:03:04.747571 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.747455 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4e51de-7450-4be4-832f-f9a1fc78dc20-kube-api-access-kbkw4" (OuterVolumeSpecName: "kube-api-access-kbkw4") pod "6c4e51de-7450-4be4-832f-f9a1fc78dc20" (UID: "6c4e51de-7450-4be4-832f-f9a1fc78dc20"). InnerVolumeSpecName "kube-api-access-kbkw4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:03:04.833950 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.833917 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" Apr 25 00:03:04.846976 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.846944 2566 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c4e51de-7450-4be4-832f-f9a1fc78dc20-cert\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:03:04.846976 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.846977 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kbkw4\" (UniqueName: \"kubernetes.io/projected/6c4e51de-7450-4be4-832f-f9a1fc78dc20-kube-api-access-kbkw4\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:03:04.981551 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:04.981518 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-ddv76"] Apr 25 00:03:04.984552 ip-10-0-139-62 kubenswrapper[2566]: W0425 00:03:04.984509 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687a38ca_26ee_4f6b_8d3e_26a0edaa1e1b.slice/crio-a156081fb418b4ad40c084fca7baad548914e0353763d93df656fec8150fa9e5 WatchSource:0}: Error finding container a156081fb418b4ad40c084fca7baad548914e0353763d93df656fec8150fa9e5: Status 404 returned error can't find the container with id a156081fb418b4ad40c084fca7baad548914e0353763d93df656fec8150fa9e5 Apr 25 00:03:05.035164 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:05.035127 2566 generic.go:358] "Generic (PLEG): container finished" podID="6c4e51de-7450-4be4-832f-f9a1fc78dc20" containerID="0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6" exitCode=0 Apr 25 00:03:05.035364 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:05.035200 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" Apr 25 00:03:05.035364 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:05.035230 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" event={"ID":"6c4e51de-7450-4be4-832f-f9a1fc78dc20","Type":"ContainerDied","Data":"0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6"} Apr 25 00:03:05.035364 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:05.035265 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-p5frp" event={"ID":"6c4e51de-7450-4be4-832f-f9a1fc78dc20","Type":"ContainerDied","Data":"680e39e0af5ea3672ca75fd8f686f10dc91afd2c2d4e647e47093d0b3009d2a6"} Apr 25 00:03:05.035364 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:05.035281 2566 scope.go:117] "RemoveContainer" containerID="0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6" Apr 25 00:03:05.036419 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:05.036399 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" event={"ID":"687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b","Type":"ContainerStarted","Data":"a156081fb418b4ad40c084fca7baad548914e0353763d93df656fec8150fa9e5"} Apr 25 00:03:05.043489 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:05.043467 2566 scope.go:117] "RemoveContainer" containerID="0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6" Apr 25 00:03:05.043765 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:03:05.043741 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6\": container with ID starting with 0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6 not found: ID does not exist" containerID="0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6" Apr 25 
00:03:05.043821 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:05.043778 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6"} err="failed to get container status \"0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6\": rpc error: code = NotFound desc = could not find container \"0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6\": container with ID starting with 0199bc06e6d3c9b59c0747812a73dac8ef52b5982d80c56d66b696139049b4e6 not found: ID does not exist" Apr 25 00:03:05.056093 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:05.056068 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-p5frp"] Apr 25 00:03:05.066664 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:05.066639 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-64c4d9588d-p5frp"] Apr 25 00:03:06.041149 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:06.041109 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" event={"ID":"687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b","Type":"ContainerStarted","Data":"4be9c006f363d4cb0fa57061e6fa933b0e88e5c19c24da63777ef77589026ce9"} Apr 25 00:03:06.041583 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:06.041201 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" Apr 25 00:03:06.058644 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:06.058595 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" podStartSLOduration=1.486451501 podStartE2EDuration="2.058578658s" podCreationTimestamp="2026-04-25 00:03:04 +0000 UTC" firstStartedPulling="2026-04-25 00:03:04.985859959 +0000 UTC m=+585.115068218" lastFinishedPulling="2026-04-25 
00:03:05.557987113 +0000 UTC m=+585.687195375" observedRunningTime="2026-04-25 00:03:06.056702591 +0000 UTC m=+586.185910870" watchObservedRunningTime="2026-04-25 00:03:06.058578658 +0000 UTC m=+586.187786939" Apr 25 00:03:06.404790 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:06.404692 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4e51de-7450-4be4-832f-f9a1fc78dc20" path="/var/lib/kubelet/pods/6c4e51de-7450-4be4-832f-f9a1fc78dc20/volumes" Apr 25 00:03:15.420517 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:15.420476 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8456455dd-qqx7s"] Apr 25 00:03:20.368860 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:20.368831 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log" Apr 25 00:03:20.369326 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:20.368839 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log" Apr 25 00:03:37.050343 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:37.050261 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-64c4d9588d-ddv76" Apr 25 00:03:41.094277 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.094187 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8456455dd-qqx7s" podUID="bb107eb5-5562-4916-9b2a-5fbc8c067a4c" containerName="console" containerID="cri-o://c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96" gracePeriod=15 Apr 25 00:03:41.334120 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.334100 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8456455dd-qqx7s_bb107eb5-5562-4916-9b2a-5fbc8c067a4c/console/0.log" Apr 25 
00:03:41.334234 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.334162 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8456455dd-qqx7s" Apr 25 00:03:41.432242 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.432136 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cm64\" (UniqueName: \"kubernetes.io/projected/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-kube-api-access-2cm64\") pod \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " Apr 25 00:03:41.432242 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.432171 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-oauth-config\") pod \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " Apr 25 00:03:41.432242 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.432200 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-config\") pod \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " Apr 25 00:03:41.432242 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.432240 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-trusted-ca-bundle\") pod \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " Apr 25 00:03:41.432562 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.432358 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-oauth-serving-cert\") pod \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " Apr 25 00:03:41.432562 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.432401 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-serving-cert\") pod \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " Apr 25 00:03:41.432562 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.432468 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-service-ca\") pod \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\" (UID: \"bb107eb5-5562-4916-9b2a-5fbc8c067a4c\") " Apr 25 00:03:41.432722 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.432676 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bb107eb5-5562-4916-9b2a-5fbc8c067a4c" (UID: "bb107eb5-5562-4916-9b2a-5fbc8c067a4c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:03:41.432774 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.432712 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-config" (OuterVolumeSpecName: "console-config") pod "bb107eb5-5562-4916-9b2a-5fbc8c067a4c" (UID: "bb107eb5-5562-4916-9b2a-5fbc8c067a4c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:03:41.432841 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.432799 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bb107eb5-5562-4916-9b2a-5fbc8c067a4c" (UID: "bb107eb5-5562-4916-9b2a-5fbc8c067a4c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:03:41.432965 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.432944 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-service-ca" (OuterVolumeSpecName: "service-ca") pod "bb107eb5-5562-4916-9b2a-5fbc8c067a4c" (UID: "bb107eb5-5562-4916-9b2a-5fbc8c067a4c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:03:41.434438 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.434410 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bb107eb5-5562-4916-9b2a-5fbc8c067a4c" (UID: "bb107eb5-5562-4916-9b2a-5fbc8c067a4c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:03:41.434536 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.434461 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-kube-api-access-2cm64" (OuterVolumeSpecName: "kube-api-access-2cm64") pod "bb107eb5-5562-4916-9b2a-5fbc8c067a4c" (UID: "bb107eb5-5562-4916-9b2a-5fbc8c067a4c"). InnerVolumeSpecName "kube-api-access-2cm64". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:03:41.434536 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.434465 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bb107eb5-5562-4916-9b2a-5fbc8c067a4c" (UID: "bb107eb5-5562-4916-9b2a-5fbc8c067a4c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:03:41.534021 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.533963 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-service-ca\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:03:41.534021 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.534006 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2cm64\" (UniqueName: \"kubernetes.io/projected/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-kube-api-access-2cm64\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:03:41.534021 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.534018 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-oauth-config\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:03:41.534021 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.534026 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-config\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:03:41.534021 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.534035 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-trusted-ca-bundle\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:03:41.534331 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.534044 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-oauth-serving-cert\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:03:41.534331 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:41.534053 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb107eb5-5562-4916-9b2a-5fbc8c067a4c-console-serving-cert\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:03:42.156332 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:42.156303 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8456455dd-qqx7s_bb107eb5-5562-4916-9b2a-5fbc8c067a4c/console/0.log" Apr 25 00:03:42.156738 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:42.156340 2566 generic.go:358] "Generic (PLEG): container finished" podID="bb107eb5-5562-4916-9b2a-5fbc8c067a4c" containerID="c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96" exitCode=2 Apr 25 00:03:42.156738 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:42.156413 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8456455dd-qqx7s" event={"ID":"bb107eb5-5562-4916-9b2a-5fbc8c067a4c","Type":"ContainerDied","Data":"c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96"} Apr 25 00:03:42.156738 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:42.156433 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8456455dd-qqx7s" Apr 25 00:03:42.156738 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:42.156449 2566 scope.go:117] "RemoveContainer" containerID="c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96" Apr 25 00:03:42.156738 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:42.156439 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8456455dd-qqx7s" event={"ID":"bb107eb5-5562-4916-9b2a-5fbc8c067a4c","Type":"ContainerDied","Data":"4bdfb455116474b994ca15210ee1f7f2feb03f8616aef13986145fb8981d28f4"} Apr 25 00:03:42.165012 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:42.164989 2566 scope.go:117] "RemoveContainer" containerID="c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96" Apr 25 00:03:42.165310 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:03:42.165287 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96\": container with ID starting with c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96 not found: ID does not exist" containerID="c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96" Apr 25 00:03:42.165395 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:42.165313 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96"} err="failed to get container status \"c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96\": rpc error: code = NotFound desc = could not find container \"c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96\": container with ID starting with c9aa8909ac028dd26e85f1b209d3b33572e7c1625e220834cc7bf8d77c3dad96 not found: ID does not exist" Apr 25 00:03:42.182731 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:42.182707 2566 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-8456455dd-qqx7s"] Apr 25 00:03:42.185881 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:42.185858 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8456455dd-qqx7s"] Apr 25 00:03:42.403439 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:03:42.403404 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb107eb5-5562-4916-9b2a-5fbc8c067a4c" path="/var/lib/kubelet/pods/bb107eb5-5562-4916-9b2a-5fbc8c067a4c/volumes" Apr 25 00:05:58.763377 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.763342 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7"] Apr 25 00:05:58.763926 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.763640 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb107eb5-5562-4916-9b2a-5fbc8c067a4c" containerName="console" Apr 25 00:05:58.763926 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.763652 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb107eb5-5562-4916-9b2a-5fbc8c067a4c" containerName="console" Apr 25 00:05:58.763926 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.763663 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c4e51de-7450-4be4-832f-f9a1fc78dc20" containerName="manager" Apr 25 00:05:58.763926 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.763669 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4e51de-7450-4be4-832f-f9a1fc78dc20" containerName="manager" Apr 25 00:05:58.763926 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.763714 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb107eb5-5562-4916-9b2a-5fbc8c067a4c" containerName="console" Apr 25 00:05:58.763926 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.763721 2566 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="6c4e51de-7450-4be4-832f-f9a1fc78dc20" containerName="manager" Apr 25 00:05:58.767006 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.766980 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:58.769425 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.769401 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config\"" Apr 25 00:05:58.769557 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.769434 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 25 00:05:58.769557 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.769508 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-3b499-predictor-serving-cert\"" Apr 25 00:05:58.770423 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.770407 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 25 00:05:58.770495 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.770413 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gz2jj\"" Apr 25 00:05:58.774820 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.774776 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7"] Apr 25 00:05:58.897457 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.897421 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/505acb2b-854c-4ba7-a80f-68509f5fee55-isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:58.897655 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.897483 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hszq5\" (UniqueName: \"kubernetes.io/projected/505acb2b-854c-4ba7-a80f-68509f5fee55-kube-api-access-hszq5\") pod \"isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:58.897655 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.897522 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/505acb2b-854c-4ba7-a80f-68509f5fee55-proxy-tls\") pod \"isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:58.897655 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.897580 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/505acb2b-854c-4ba7-a80f-68509f5fee55-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:58.998859 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.998805 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/505acb2b-854c-4ba7-a80f-68509f5fee55-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:58.999034 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.998968 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/505acb2b-854c-4ba7-a80f-68509f5fee55-isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:58.999034 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.998998 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hszq5\" (UniqueName: \"kubernetes.io/projected/505acb2b-854c-4ba7-a80f-68509f5fee55-kube-api-access-hszq5\") pod \"isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:58.999162 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.999132 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/505acb2b-854c-4ba7-a80f-68509f5fee55-proxy-tls\") pod \"isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:58.999425 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.999238 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/505acb2b-854c-4ba7-a80f-68509f5fee55-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:58.999668 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:58.999647 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/505acb2b-854c-4ba7-a80f-68509f5fee55-isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:59.001478 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:59.001460 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/505acb2b-854c-4ba7-a80f-68509f5fee55-proxy-tls\") pod \"isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:59.006347 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:59.006325 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hszq5\" (UniqueName: \"kubernetes.io/projected/505acb2b-854c-4ba7-a80f-68509f5fee55-kube-api-access-hszq5\") pod \"isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:59.079584 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:59.079547 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:05:59.207264 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:59.207235 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7"] Apr 25 00:05:59.211346 ip-10-0-139-62 kubenswrapper[2566]: W0425 00:05:59.211298 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod505acb2b_854c_4ba7_a80f_68509f5fee55.slice/crio-31801a07e3f50c930fdf889eb475a583432bd35e37ef2002c5a10feebcf8c593 WatchSource:0}: Error finding container 31801a07e3f50c930fdf889eb475a583432bd35e37ef2002c5a10feebcf8c593: Status 404 returned error can't find the container with id 31801a07e3f50c930fdf889eb475a583432bd35e37ef2002c5a10feebcf8c593 Apr 25 00:05:59.213162 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:59.213143 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:05:59.568048 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:05:59.568006 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" event={"ID":"505acb2b-854c-4ba7-a80f-68509f5fee55","Type":"ContainerStarted","Data":"31801a07e3f50c930fdf889eb475a583432bd35e37ef2002c5a10feebcf8c593"} Apr 25 00:06:02.578768 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:02.578684 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" event={"ID":"505acb2b-854c-4ba7-a80f-68509f5fee55","Type":"ContainerStarted","Data":"e49c532603b48583df8d384bdb10b3c50fe2722360661ff5faf9e9ef3bc76c39"} Apr 25 00:06:06.592528 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:06.592441 2566 generic.go:358] "Generic (PLEG): container finished" podID="505acb2b-854c-4ba7-a80f-68509f5fee55" 
containerID="e49c532603b48583df8d384bdb10b3c50fe2722360661ff5faf9e9ef3bc76c39" exitCode=0 Apr 25 00:06:06.592528 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:06.592513 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" event={"ID":"505acb2b-854c-4ba7-a80f-68509f5fee55","Type":"ContainerDied","Data":"e49c532603b48583df8d384bdb10b3c50fe2722360661ff5faf9e9ef3bc76c39"} Apr 25 00:06:26.662594 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:26.662538 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" event={"ID":"505acb2b-854c-4ba7-a80f-68509f5fee55","Type":"ContainerStarted","Data":"d5c9cc041486de24e4d0f6b1f5bfaa16aa228987130f9049210f801d438ae77f"} Apr 25 00:06:28.670517 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:28.670482 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" event={"ID":"505acb2b-854c-4ba7-a80f-68509f5fee55","Type":"ContainerStarted","Data":"1b12f89285b0e7b12fbe28ef86bc7a75b06b39b503ed7706e8ff639512d7c49b"} Apr 25 00:06:28.670897 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:28.670660 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:06:28.706714 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:28.706663 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" podStartSLOduration=1.72209484 podStartE2EDuration="30.70665051s" podCreationTimestamp="2026-04-25 00:05:58 +0000 UTC" firstStartedPulling="2026-04-25 00:05:59.213310053 +0000 UTC m=+759.342518312" lastFinishedPulling="2026-04-25 00:06:28.197865721 +0000 UTC m=+788.327073982" observedRunningTime="2026-04-25 00:06:28.704398215 +0000 
UTC m=+788.833606506" watchObservedRunningTime="2026-04-25 00:06:28.70665051 +0000 UTC m=+788.835858789" Apr 25 00:06:29.673871 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:29.673838 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:06:29.675193 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:29.675164 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 25 00:06:30.677119 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:30.677079 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 25 00:06:35.681638 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:35.681563 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:06:35.682148 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:35.682118 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 25 00:06:45.682722 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:45.682681 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 25 00:06:55.682110 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:06:55.682072 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 25 00:07:05.682109 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:05.682063 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 25 00:07:15.682090 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:15.682047 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 25 00:07:25.682424 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:25.682384 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 25 00:07:35.682878 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:35.682846 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:07:49.097133 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.097094 2566 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7"] Apr 25 00:07:49.097683 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.097452 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kserve-container" containerID="cri-o://d5c9cc041486de24e4d0f6b1f5bfaa16aa228987130f9049210f801d438ae77f" gracePeriod=30 Apr 25 00:07:49.097683 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.097477 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kube-rbac-proxy" containerID="cri-o://1b12f89285b0e7b12fbe28ef86bc7a75b06b39b503ed7706e8ff639512d7c49b" gracePeriod=30 Apr 25 00:07:49.122431 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.122386 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb"] Apr 25 00:07:49.126587 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.126561 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.129005 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.128974 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-serving-cert\"" Apr 25 00:07:49.129152 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.129070 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\"" Apr 25 00:07:49.136830 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.136803 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb"] Apr 25 00:07:49.280215 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.280152 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.280418 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.280275 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.280418 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.280314 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnvpr\" (UniqueName: \"kubernetes.io/projected/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-kube-api-access-nnvpr\") pod \"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.280418 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.280365 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.381685 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.381579 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.381685 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.381637 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnvpr\" (UniqueName: \"kubernetes.io/projected/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-kube-api-access-nnvpr\") pod \"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.381927 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.381689 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.381927 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.381728 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.382128 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.382103 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.382442 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.382419 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.384061 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.384040 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.391423 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.391396 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnvpr\" (UniqueName: \"kubernetes.io/projected/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-kube-api-access-nnvpr\") pod \"isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.441862 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.441794 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:49.570660 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.570632 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb"] Apr 25 00:07:49.573264 ip-10-0-139-62 kubenswrapper[2566]: W0425 00:07:49.573222 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d9c96ff_0e6a_4265_8a48_450f1fdf5501.slice/crio-2f309c22710706cdc45e30932335d99ddc05bd14a4ca54413329e52cc75230ec WatchSource:0}: Error finding container 2f309c22710706cdc45e30932335d99ddc05bd14a4ca54413329e52cc75230ec: Status 404 returned error can't find the container with id 2f309c22710706cdc45e30932335d99ddc05bd14a4ca54413329e52cc75230ec Apr 25 00:07:49.928486 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.928451 2566 generic.go:358] "Generic (PLEG): container finished" podID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerID="1b12f89285b0e7b12fbe28ef86bc7a75b06b39b503ed7706e8ff639512d7c49b" exitCode=2 Apr 25 00:07:49.928658 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.928517 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" event={"ID":"505acb2b-854c-4ba7-a80f-68509f5fee55","Type":"ContainerDied","Data":"1b12f89285b0e7b12fbe28ef86bc7a75b06b39b503ed7706e8ff639512d7c49b"} Apr 25 00:07:49.929783 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.929747 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" event={"ID":"4d9c96ff-0e6a-4265-8a48-450f1fdf5501","Type":"ContainerStarted","Data":"c7e8beedd63586329cb431eaa692097184371c7640a871822731db5a19c475af"} Apr 25 00:07:49.929783 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:49.929784 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" event={"ID":"4d9c96ff-0e6a-4265-8a48-450f1fdf5501","Type":"ContainerStarted","Data":"2f309c22710706cdc45e30932335d99ddc05bd14a4ca54413329e52cc75230ec"} Apr 25 00:07:50.677762 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:50.677714 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.25:8643/healthz\": dial tcp 10.132.0.25:8643: connect: connection refused" Apr 25 00:07:52.942603 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:52.942561 2566 generic.go:358] "Generic (PLEG): container finished" podID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerID="d5c9cc041486de24e4d0f6b1f5bfaa16aa228987130f9049210f801d438ae77f" exitCode=0 Apr 25 00:07:52.942944 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:52.942638 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" event={"ID":"505acb2b-854c-4ba7-a80f-68509f5fee55","Type":"ContainerDied","Data":"d5c9cc041486de24e4d0f6b1f5bfaa16aa228987130f9049210f801d438ae77f"} Apr 25 00:07:53.042193 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.042167 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:07:53.215819 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.215709 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hszq5\" (UniqueName: \"kubernetes.io/projected/505acb2b-854c-4ba7-a80f-68509f5fee55-kube-api-access-hszq5\") pod \"505acb2b-854c-4ba7-a80f-68509f5fee55\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " Apr 25 00:07:53.215819 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.215781 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/505acb2b-854c-4ba7-a80f-68509f5fee55-proxy-tls\") pod \"505acb2b-854c-4ba7-a80f-68509f5fee55\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " Apr 25 00:07:53.215819 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.215810 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/505acb2b-854c-4ba7-a80f-68509f5fee55-kserve-provision-location\") pod \"505acb2b-854c-4ba7-a80f-68509f5fee55\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " Apr 25 00:07:53.216112 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.215932 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/505acb2b-854c-4ba7-a80f-68509f5fee55-isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config\") pod \"505acb2b-854c-4ba7-a80f-68509f5fee55\" (UID: \"505acb2b-854c-4ba7-a80f-68509f5fee55\") " Apr 25 00:07:53.216112 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.216100 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/505acb2b-854c-4ba7-a80f-68509f5fee55-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"505acb2b-854c-4ba7-a80f-68509f5fee55" (UID: "505acb2b-854c-4ba7-a80f-68509f5fee55"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:07:53.216238 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.216199 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/505acb2b-854c-4ba7-a80f-68509f5fee55-kserve-provision-location\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:07:53.216323 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.216303 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/505acb2b-854c-4ba7-a80f-68509f5fee55-isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config") pod "505acb2b-854c-4ba7-a80f-68509f5fee55" (UID: "505acb2b-854c-4ba7-a80f-68509f5fee55"). InnerVolumeSpecName "isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:07:53.218045 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.218022 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/505acb2b-854c-4ba7-a80f-68509f5fee55-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "505acb2b-854c-4ba7-a80f-68509f5fee55" (UID: "505acb2b-854c-4ba7-a80f-68509f5fee55"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:07:53.218139 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.218062 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505acb2b-854c-4ba7-a80f-68509f5fee55-kube-api-access-hszq5" (OuterVolumeSpecName: "kube-api-access-hszq5") pod "505acb2b-854c-4ba7-a80f-68509f5fee55" (UID: "505acb2b-854c-4ba7-a80f-68509f5fee55"). InnerVolumeSpecName "kube-api-access-hszq5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:07:53.317321 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.317280 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/505acb2b-854c-4ba7-a80f-68509f5fee55-isvc-xgboost-graph-raw-3b499-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:07:53.317321 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.317313 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hszq5\" (UniqueName: \"kubernetes.io/projected/505acb2b-854c-4ba7-a80f-68509f5fee55-kube-api-access-hszq5\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:07:53.317321 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.317323 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/505acb2b-854c-4ba7-a80f-68509f5fee55-proxy-tls\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:07:53.947749 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.947711 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" event={"ID":"505acb2b-854c-4ba7-a80f-68509f5fee55","Type":"ContainerDied","Data":"31801a07e3f50c930fdf889eb475a583432bd35e37ef2002c5a10feebcf8c593"} Apr 25 00:07:53.947749 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.947756 2566 scope.go:117] "RemoveContainer" containerID="1b12f89285b0e7b12fbe28ef86bc7a75b06b39b503ed7706e8ff639512d7c49b" Apr 25 00:07:53.948273 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.947764 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7" Apr 25 00:07:53.955133 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.955098 2566 generic.go:358] "Generic (PLEG): container finished" podID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerID="c7e8beedd63586329cb431eaa692097184371c7640a871822731db5a19c475af" exitCode=0 Apr 25 00:07:53.955272 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.955169 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" event={"ID":"4d9c96ff-0e6a-4265-8a48-450f1fdf5501","Type":"ContainerDied","Data":"c7e8beedd63586329cb431eaa692097184371c7640a871822731db5a19c475af"} Apr 25 00:07:53.961531 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.961512 2566 scope.go:117] "RemoveContainer" containerID="d5c9cc041486de24e4d0f6b1f5bfaa16aa228987130f9049210f801d438ae77f" Apr 25 00:07:53.969118 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.969096 2566 scope.go:117] "RemoveContainer" containerID="e49c532603b48583df8d384bdb10b3c50fe2722360661ff5faf9e9ef3bc76c39" Apr 25 00:07:53.984282 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.984255 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7"] Apr 25 00:07:53.988542 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:53.988515 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-3b499-predictor-f94984f49-pzpv7"] Apr 25 00:07:54.404006 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:54.403967 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" path="/var/lib/kubelet/pods/505acb2b-854c-4ba7-a80f-68509f5fee55/volumes" Apr 25 00:07:54.961085 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:54.961042 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" event={"ID":"4d9c96ff-0e6a-4265-8a48-450f1fdf5501","Type":"ContainerStarted","Data":"333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795"} Apr 25 00:07:54.961085 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:54.961090 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" event={"ID":"4d9c96ff-0e6a-4265-8a48-450f1fdf5501","Type":"ContainerStarted","Data":"78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a"} Apr 25 00:07:54.961626 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:54.961383 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:54.961626 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:54.961417 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:07:54.962871 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:54.962839 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 25 00:07:54.980338 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:54.980266 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podStartSLOduration=5.980247878 podStartE2EDuration="5.980247878s" podCreationTimestamp="2026-04-25 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:07:54.97896378 
+0000 UTC m=+875.108172061" watchObservedRunningTime="2026-04-25 00:07:54.980247878 +0000 UTC m=+875.109456158" Apr 25 00:07:55.966463 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:07:55.966423 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 25 00:08:00.971628 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:08:00.971595 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:08:00.972259 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:08:00.972200 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 25 00:08:10.973128 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:08:10.973084 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 25 00:08:20.388055 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:08:20.388023 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log" Apr 25 00:08:20.388740 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:08:20.388719 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log" Apr 25 00:08:20.972437 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:08:20.972394 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 25 00:08:30.972151 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:08:30.972104 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 25 00:08:40.972109 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:08:40.972072 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 25 00:08:50.972862 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:08:50.972816 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 25 00:09:00.973649 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:00.973607 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:09:29.309250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.309198 
2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm"] Apr 25 00:09:29.310016 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.309663 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="storage-initializer" Apr 25 00:09:29.310016 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.309687 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="storage-initializer" Apr 25 00:09:29.310016 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.309712 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kserve-container" Apr 25 00:09:29.310016 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.309721 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kserve-container" Apr 25 00:09:29.310016 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.309731 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kube-rbac-proxy" Apr 25 00:09:29.310016 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.309740 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kube-rbac-proxy" Apr 25 00:09:29.310016 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.309819 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kube-rbac-proxy" Apr 25 00:09:29.310016 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.309835 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="505acb2b-854c-4ba7-a80f-68509f5fee55" containerName="kserve-container" Apr 25 00:09:29.313220 ip-10-0-139-62 kubenswrapper[2566]: I0425 
00:09:29.313180 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:29.316660 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.316638 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-b8788-predictor-serving-cert\"" Apr 25 00:09:29.316909 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.316651 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-b8788-kube-rbac-proxy-sar-config\"" Apr 25 00:09:29.325426 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.325400 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm"] Apr 25 00:09:29.337269 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.337198 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb"] Apr 25 00:09:29.337709 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.337676 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" containerID="cri-o://78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a" gracePeriod=30 Apr 25 00:09:29.337944 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.337758 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kube-rbac-proxy" containerID="cri-o://333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795" gracePeriod=30 Apr 25 00:09:29.369171 ip-10-0-139-62 kubenswrapper[2566]: I0425 
00:09:29.369133 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76gqw\" (UniqueName: \"kubernetes.io/projected/b8fc01df-c19f-4457-9b79-e1aaa951822d-kube-api-access-76gqw\") pod \"message-dumper-raw-b8788-predictor-85565b99f9-hckmm\" (UID: \"b8fc01df-c19f-4457-9b79-e1aaa951822d\") " pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:29.369367 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.369191 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-b8788-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8fc01df-c19f-4457-9b79-e1aaa951822d-message-dumper-raw-b8788-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-b8788-predictor-85565b99f9-hckmm\" (UID: \"b8fc01df-c19f-4457-9b79-e1aaa951822d\") " pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:29.369367 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.369283 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8fc01df-c19f-4457-9b79-e1aaa951822d-proxy-tls\") pod \"message-dumper-raw-b8788-predictor-85565b99f9-hckmm\" (UID: \"b8fc01df-c19f-4457-9b79-e1aaa951822d\") " pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:29.470141 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.470105 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76gqw\" (UniqueName: \"kubernetes.io/projected/b8fc01df-c19f-4457-9b79-e1aaa951822d-kube-api-access-76gqw\") pod \"message-dumper-raw-b8788-predictor-85565b99f9-hckmm\" (UID: \"b8fc01df-c19f-4457-9b79-e1aaa951822d\") " pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:29.470315 ip-10-0-139-62 kubenswrapper[2566]: I0425 
00:09:29.470151 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-b8788-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8fc01df-c19f-4457-9b79-e1aaa951822d-message-dumper-raw-b8788-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-b8788-predictor-85565b99f9-hckmm\" (UID: \"b8fc01df-c19f-4457-9b79-e1aaa951822d\") " pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:29.470315 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.470180 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8fc01df-c19f-4457-9b79-e1aaa951822d-proxy-tls\") pod \"message-dumper-raw-b8788-predictor-85565b99f9-hckmm\" (UID: \"b8fc01df-c19f-4457-9b79-e1aaa951822d\") " pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:29.470938 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.470918 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-b8788-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8fc01df-c19f-4457-9b79-e1aaa951822d-message-dumper-raw-b8788-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-b8788-predictor-85565b99f9-hckmm\" (UID: \"b8fc01df-c19f-4457-9b79-e1aaa951822d\") " pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:29.472727 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.472702 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8fc01df-c19f-4457-9b79-e1aaa951822d-proxy-tls\") pod \"message-dumper-raw-b8788-predictor-85565b99f9-hckmm\" (UID: \"b8fc01df-c19f-4457-9b79-e1aaa951822d\") " pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:29.478172 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.478141 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76gqw\" (UniqueName: \"kubernetes.io/projected/b8fc01df-c19f-4457-9b79-e1aaa951822d-kube-api-access-76gqw\") pod \"message-dumper-raw-b8788-predictor-85565b99f9-hckmm\" (UID: \"b8fc01df-c19f-4457-9b79-e1aaa951822d\") " pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:29.627247 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.627119 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:29.748007 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:29.747983 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm"] Apr 25 00:09:29.750352 ip-10-0-139-62 kubenswrapper[2566]: W0425 00:09:29.750320 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8fc01df_c19f_4457_9b79_e1aaa951822d.slice/crio-54bc2349fbd7ac7a689e94083e9a37aabbf545b994c07d51fbbe8301f94250cd WatchSource:0}: Error finding container 54bc2349fbd7ac7a689e94083e9a37aabbf545b994c07d51fbbe8301f94250cd: Status 404 returned error can't find the container with id 54bc2349fbd7ac7a689e94083e9a37aabbf545b994c07d51fbbe8301f94250cd Apr 25 00:09:30.269816 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:30.269781 2566 generic.go:358] "Generic (PLEG): container finished" podID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerID="333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795" exitCode=2 Apr 25 00:09:30.269970 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:30.269848 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" 
event={"ID":"4d9c96ff-0e6a-4265-8a48-450f1fdf5501","Type":"ContainerDied","Data":"333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795"} Apr 25 00:09:30.270848 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:30.270825 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" event={"ID":"b8fc01df-c19f-4457-9b79-e1aaa951822d","Type":"ContainerStarted","Data":"54bc2349fbd7ac7a689e94083e9a37aabbf545b994c07d51fbbe8301f94250cd"} Apr 25 00:09:30.967015 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:30.966968 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.26:8643/healthz\": dial tcp 10.132.0.26:8643: connect: connection refused" Apr 25 00:09:30.972265 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:30.972237 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 25 00:09:31.276243 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:31.276108 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" event={"ID":"b8fc01df-c19f-4457-9b79-e1aaa951822d","Type":"ContainerStarted","Data":"1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f"} Apr 25 00:09:31.276243 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:31.276144 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" 
event={"ID":"b8fc01df-c19f-4457-9b79-e1aaa951822d","Type":"ContainerStarted","Data":"2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5"} Apr 25 00:09:31.276443 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:31.276343 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:31.276492 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:31.276462 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:31.278255 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:31.278235 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:09:31.294157 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:31.294109 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" podStartSLOduration=1.189025774 podStartE2EDuration="2.294096396s" podCreationTimestamp="2026-04-25 00:09:29 +0000 UTC" firstStartedPulling="2026-04-25 00:09:29.752351851 +0000 UTC m=+969.881560109" lastFinishedPulling="2026-04-25 00:09:30.857422466 +0000 UTC m=+970.986630731" observedRunningTime="2026-04-25 00:09:31.292719074 +0000 UTC m=+971.421927368" watchObservedRunningTime="2026-04-25 00:09:31.294096396 +0000 UTC m=+971.423304676" Apr 25 00:09:33.180967 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.180943 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:09:33.284231 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.284171 2566 generic.go:358] "Generic (PLEG): container finished" podID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerID="78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a" exitCode=0 Apr 25 00:09:33.284412 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.284262 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" event={"ID":"4d9c96ff-0e6a-4265-8a48-450f1fdf5501","Type":"ContainerDied","Data":"78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a"} Apr 25 00:09:33.284412 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.284286 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" Apr 25 00:09:33.284412 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.284308 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb" event={"ID":"4d9c96ff-0e6a-4265-8a48-450f1fdf5501","Type":"ContainerDied","Data":"2f309c22710706cdc45e30932335d99ddc05bd14a4ca54413329e52cc75230ec"} Apr 25 00:09:33.284412 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.284328 2566 scope.go:117] "RemoveContainer" containerID="333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795" Apr 25 00:09:33.291858 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.291836 2566 scope.go:117] "RemoveContainer" containerID="78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a" Apr 25 00:09:33.299272 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.299250 2566 scope.go:117] "RemoveContainer" containerID="c7e8beedd63586329cb431eaa692097184371c7640a871822731db5a19c475af" Apr 25 00:09:33.301474 
ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.301444 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-proxy-tls\") pod \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " Apr 25 00:09:33.301576 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.301510 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\") pod \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " Apr 25 00:09:33.301576 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.301552 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-kserve-provision-location\") pod \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " Apr 25 00:09:33.301653 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.301627 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnvpr\" (UniqueName: \"kubernetes.io/projected/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-kube-api-access-nnvpr\") pod \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\" (UID: \"4d9c96ff-0e6a-4265-8a48-450f1fdf5501\") " Apr 25 00:09:33.301899 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.301861 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config") pod "4d9c96ff-0e6a-4265-8a48-450f1fdf5501" (UID: 
"4d9c96ff-0e6a-4265-8a48-450f1fdf5501"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:09:33.301899 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.301888 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4d9c96ff-0e6a-4265-8a48-450f1fdf5501" (UID: "4d9c96ff-0e6a-4265-8a48-450f1fdf5501"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 25 00:09:33.303537 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.303516 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4d9c96ff-0e6a-4265-8a48-450f1fdf5501" (UID: "4d9c96ff-0e6a-4265-8a48-450f1fdf5501"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:09:33.303729 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.303706 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-kube-api-access-nnvpr" (OuterVolumeSpecName: "kube-api-access-nnvpr") pod "4d9c96ff-0e6a-4265-8a48-450f1fdf5501" (UID: "4d9c96ff-0e6a-4265-8a48-450f1fdf5501"). InnerVolumeSpecName "kube-api-access-nnvpr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:09:33.307487 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.307468 2566 scope.go:117] "RemoveContainer" containerID="333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795" Apr 25 00:09:33.307799 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:09:33.307773 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795\": container with ID starting with 333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795 not found: ID does not exist" containerID="333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795" Apr 25 00:09:33.307851 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.307808 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795"} err="failed to get container status \"333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795\": rpc error: code = NotFound desc = could not find container \"333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795\": container with ID starting with 333b77ce88f20dd0c2b1e2a5d6a8ad570e923f114fe26363731a5481dad78795 not found: ID does not exist" Apr 25 00:09:33.307851 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.307826 2566 scope.go:117] "RemoveContainer" containerID="78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a" Apr 25 00:09:33.308064 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:09:33.308050 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a\": container with ID starting with 78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a not found: ID does not exist" 
containerID="78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a" Apr 25 00:09:33.308108 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.308066 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a"} err="failed to get container status \"78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a\": rpc error: code = NotFound desc = could not find container \"78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a\": container with ID starting with 78b149d8a8869efc5b3a7fa26834796cea04a78a5c34c1dad55557dcab9a181a not found: ID does not exist" Apr 25 00:09:33.308108 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.308078 2566 scope.go:117] "RemoveContainer" containerID="c7e8beedd63586329cb431eaa692097184371c7640a871822731db5a19c475af" Apr 25 00:09:33.308331 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:09:33.308312 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7e8beedd63586329cb431eaa692097184371c7640a871822731db5a19c475af\": container with ID starting with c7e8beedd63586329cb431eaa692097184371c7640a871822731db5a19c475af not found: ID does not exist" containerID="c7e8beedd63586329cb431eaa692097184371c7640a871822731db5a19c475af" Apr 25 00:09:33.308385 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.308335 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e8beedd63586329cb431eaa692097184371c7640a871822731db5a19c475af"} err="failed to get container status \"c7e8beedd63586329cb431eaa692097184371c7640a871822731db5a19c475af\": rpc error: code = NotFound desc = could not find container \"c7e8beedd63586329cb431eaa692097184371c7640a871822731db5a19c475af\": container with ID starting with c7e8beedd63586329cb431eaa692097184371c7640a871822731db5a19c475af not found: ID does not exist" Apr 25 
00:09:33.402793 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.402753 2566 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-isvc-xgboost-graph-raw-hpa-3f0c4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:09:33.402793 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.402790 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-kserve-provision-location\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:09:33.402793 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.402800 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nnvpr\" (UniqueName: \"kubernetes.io/projected/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-kube-api-access-nnvpr\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:09:33.403031 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.402810 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d9c96ff-0e6a-4265-8a48-450f1fdf5501-proxy-tls\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:09:33.606153 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.606115 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb"] Apr 25 00:09:33.612053 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:33.612022 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3f0c4-predictor-76f85bb77c-2s7tb"] Apr 25 00:09:34.404531 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:34.404491 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" 
path="/var/lib/kubelet/pods/4d9c96ff-0e6a-4265-8a48-450f1fdf5501/volumes" Apr 25 00:09:38.288692 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:09:38.288664 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:11:04.381438 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:04.381355 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-b8788-predictor-85565b99f9-hckmm_b8fc01df-c19f-4457-9b79-e1aaa951822d/kserve-container/0.log" Apr 25 00:11:04.651107 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:04.651014 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm"] Apr 25 00:11:04.651432 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:04.651362 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" podUID="b8fc01df-c19f-4457-9b79-e1aaa951822d" containerName="kube-rbac-proxy" containerID="cri-o://1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f" gracePeriod=30 Apr 25 00:11:04.651701 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:04.651405 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" podUID="b8fc01df-c19f-4457-9b79-e1aaa951822d" containerName="kserve-container" containerID="cri-o://2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5" gracePeriod=30 Apr 25 00:11:04.905888 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:04.905822 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:11:05.044822 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.044776 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-b8788-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8fc01df-c19f-4457-9b79-e1aaa951822d-message-dumper-raw-b8788-kube-rbac-proxy-sar-config\") pod \"b8fc01df-c19f-4457-9b79-e1aaa951822d\" (UID: \"b8fc01df-c19f-4457-9b79-e1aaa951822d\") " Apr 25 00:11:05.045004 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.044877 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8fc01df-c19f-4457-9b79-e1aaa951822d-proxy-tls\") pod \"b8fc01df-c19f-4457-9b79-e1aaa951822d\" (UID: \"b8fc01df-c19f-4457-9b79-e1aaa951822d\") " Apr 25 00:11:05.045004 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.044913 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76gqw\" (UniqueName: \"kubernetes.io/projected/b8fc01df-c19f-4457-9b79-e1aaa951822d-kube-api-access-76gqw\") pod \"b8fc01df-c19f-4457-9b79-e1aaa951822d\" (UID: \"b8fc01df-c19f-4457-9b79-e1aaa951822d\") " Apr 25 00:11:05.045272 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.045201 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8fc01df-c19f-4457-9b79-e1aaa951822d-message-dumper-raw-b8788-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-b8788-kube-rbac-proxy-sar-config") pod "b8fc01df-c19f-4457-9b79-e1aaa951822d" (UID: "b8fc01df-c19f-4457-9b79-e1aaa951822d"). InnerVolumeSpecName "message-dumper-raw-b8788-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 25 00:11:05.046939 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.046908 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8fc01df-c19f-4457-9b79-e1aaa951822d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b8fc01df-c19f-4457-9b79-e1aaa951822d" (UID: "b8fc01df-c19f-4457-9b79-e1aaa951822d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 25 00:11:05.047354 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.047332 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8fc01df-c19f-4457-9b79-e1aaa951822d-kube-api-access-76gqw" (OuterVolumeSpecName: "kube-api-access-76gqw") pod "b8fc01df-c19f-4457-9b79-e1aaa951822d" (UID: "b8fc01df-c19f-4457-9b79-e1aaa951822d"). InnerVolumeSpecName "kube-api-access-76gqw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 25 00:11:05.146017 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.145975 2566 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-b8788-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b8fc01df-c19f-4457-9b79-e1aaa951822d-message-dumper-raw-b8788-kube-rbac-proxy-sar-config\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:11:05.146017 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.146013 2566 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8fc01df-c19f-4457-9b79-e1aaa951822d-proxy-tls\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 00:11:05.146257 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.146029 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-76gqw\" (UniqueName: \"kubernetes.io/projected/b8fc01df-c19f-4457-9b79-e1aaa951822d-kube-api-access-76gqw\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\"" Apr 25 
00:11:05.574829 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.574796 2566 generic.go:358] "Generic (PLEG): container finished" podID="b8fc01df-c19f-4457-9b79-e1aaa951822d" containerID="1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f" exitCode=2 Apr 25 00:11:05.574829 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.574822 2566 generic.go:358] "Generic (PLEG): container finished" podID="b8fc01df-c19f-4457-9b79-e1aaa951822d" containerID="2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5" exitCode=2 Apr 25 00:11:05.575295 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.574864 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" Apr 25 00:11:05.575295 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.574879 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" event={"ID":"b8fc01df-c19f-4457-9b79-e1aaa951822d","Type":"ContainerDied","Data":"1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f"} Apr 25 00:11:05.575295 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.574916 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" event={"ID":"b8fc01df-c19f-4457-9b79-e1aaa951822d","Type":"ContainerDied","Data":"2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5"} Apr 25 00:11:05.575295 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.574929 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm" event={"ID":"b8fc01df-c19f-4457-9b79-e1aaa951822d","Type":"ContainerDied","Data":"54bc2349fbd7ac7a689e94083e9a37aabbf545b994c07d51fbbe8301f94250cd"} Apr 25 00:11:05.575295 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.574946 2566 scope.go:117] "RemoveContainer" 
containerID="1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f" Apr 25 00:11:05.583143 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.583122 2566 scope.go:117] "RemoveContainer" containerID="2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5" Apr 25 00:11:05.590234 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.590195 2566 scope.go:117] "RemoveContainer" containerID="1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f" Apr 25 00:11:05.590484 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:11:05.590465 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f\": container with ID starting with 1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f not found: ID does not exist" containerID="1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f" Apr 25 00:11:05.590550 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.590495 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f"} err="failed to get container status \"1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f\": rpc error: code = NotFound desc = could not find container \"1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f\": container with ID starting with 1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f not found: ID does not exist" Apr 25 00:11:05.590550 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.590513 2566 scope.go:117] "RemoveContainer" containerID="2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5" Apr 25 00:11:05.590737 ip-10-0-139-62 kubenswrapper[2566]: E0425 00:11:05.590718 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5\": container with ID starting with 2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5 not found: ID does not exist" containerID="2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5" Apr 25 00:11:05.590788 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.590743 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5"} err="failed to get container status \"2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5\": rpc error: code = NotFound desc = could not find container \"2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5\": container with ID starting with 2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5 not found: ID does not exist" Apr 25 00:11:05.590788 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.590759 2566 scope.go:117] "RemoveContainer" containerID="1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f" Apr 25 00:11:05.590977 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.590960 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f"} err="failed to get container status \"1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f\": rpc error: code = NotFound desc = could not find container \"1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f\": container with ID starting with 1e63a495d41d6f25f3ee5aeccf1573c843bc1ba0824fd97df6a67378c5d33c1f not found: ID does not exist" Apr 25 00:11:05.590977 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.590977 2566 scope.go:117] "RemoveContainer" containerID="2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5" Apr 25 00:11:05.591174 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.591151 2566 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5"} err="failed to get container status \"2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5\": rpc error: code = NotFound desc = could not find container \"2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5\": container with ID starting with 2ac664f3fc5f1d1b9700f9bed83edd1a8b2ebc17c02a9062243faaaf6f1615f5 not found: ID does not exist" Apr 25 00:11:05.596530 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.596507 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm"] Apr 25 00:11:05.599702 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:05.599682 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-b8788-predictor-85565b99f9-hckmm"] Apr 25 00:11:06.407589 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:11:06.407544 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8fc01df-c19f-4457-9b79-e1aaa951822d" path="/var/lib/kubelet/pods/b8fc01df-c19f-4457-9b79-e1aaa951822d/volumes" Apr 25 00:13:20.406721 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:13:20.406694 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log" Apr 25 00:13:20.408662 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:13:20.408635 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log" Apr 25 00:18:12.671728 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.671688 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mqmcl/must-gather-5ldxt"] Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672000 2566 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8fc01df-c19f-4457-9b79-e1aaa951822d" containerName="kserve-container" Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672012 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8fc01df-c19f-4457-9b79-e1aaa951822d" containerName="kserve-container" Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672024 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kube-rbac-proxy" Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672030 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kube-rbac-proxy" Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672039 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8fc01df-c19f-4457-9b79-e1aaa951822d" containerName="kube-rbac-proxy" Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672045 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8fc01df-c19f-4457-9b79-e1aaa951822d" containerName="kube-rbac-proxy" Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672054 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="storage-initializer" Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672059 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="storage-initializer" Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672068 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" Apr 25 00:18:12.672250 ip-10-0-139-62 
kubenswrapper[2566]: I0425 00:18:12.672073 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672116 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kube-rbac-proxy" Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672125 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8fc01df-c19f-4457-9b79-e1aaa951822d" containerName="kube-rbac-proxy" Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672133 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d9c96ff-0e6a-4265-8a48-450f1fdf5501" containerName="kserve-container" Apr 25 00:18:12.672250 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.672140 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8fc01df-c19f-4457-9b79-e1aaa951822d" containerName="kserve-container" Apr 25 00:18:12.675218 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.675180 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mqmcl/must-gather-5ldxt" Apr 25 00:18:12.677544 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.677521 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mqmcl\"/\"kube-root-ca.crt\"" Apr 25 00:18:12.678500 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.678479 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mqmcl\"/\"openshift-service-ca.crt\"" Apr 25 00:18:12.678597 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.678548 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mqmcl\"/\"default-dockercfg-2z8c6\"" Apr 25 00:18:12.682404 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.682380 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mqmcl/must-gather-5ldxt"] Apr 25 00:18:12.710451 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.710412 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41f3037d-df72-4a33-b4fd-cc32b03b81f4-must-gather-output\") pod \"must-gather-5ldxt\" (UID: \"41f3037d-df72-4a33-b4fd-cc32b03b81f4\") " pod="openshift-must-gather-mqmcl/must-gather-5ldxt" Apr 25 00:18:12.710644 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.710479 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw5hl\" (UniqueName: \"kubernetes.io/projected/41f3037d-df72-4a33-b4fd-cc32b03b81f4-kube-api-access-lw5hl\") pod \"must-gather-5ldxt\" (UID: \"41f3037d-df72-4a33-b4fd-cc32b03b81f4\") " pod="openshift-must-gather-mqmcl/must-gather-5ldxt" Apr 25 00:18:12.811552 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.811513 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw5hl\" (UniqueName: 
\"kubernetes.io/projected/41f3037d-df72-4a33-b4fd-cc32b03b81f4-kube-api-access-lw5hl\") pod \"must-gather-5ldxt\" (UID: \"41f3037d-df72-4a33-b4fd-cc32b03b81f4\") " pod="openshift-must-gather-mqmcl/must-gather-5ldxt" Apr 25 00:18:12.811715 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.811677 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41f3037d-df72-4a33-b4fd-cc32b03b81f4-must-gather-output\") pod \"must-gather-5ldxt\" (UID: \"41f3037d-df72-4a33-b4fd-cc32b03b81f4\") " pod="openshift-must-gather-mqmcl/must-gather-5ldxt" Apr 25 00:18:12.811979 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.811961 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41f3037d-df72-4a33-b4fd-cc32b03b81f4-must-gather-output\") pod \"must-gather-5ldxt\" (UID: \"41f3037d-df72-4a33-b4fd-cc32b03b81f4\") " pod="openshift-must-gather-mqmcl/must-gather-5ldxt" Apr 25 00:18:12.819342 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.819309 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw5hl\" (UniqueName: \"kubernetes.io/projected/41f3037d-df72-4a33-b4fd-cc32b03b81f4-kube-api-access-lw5hl\") pod \"must-gather-5ldxt\" (UID: \"41f3037d-df72-4a33-b4fd-cc32b03b81f4\") " pod="openshift-must-gather-mqmcl/must-gather-5ldxt" Apr 25 00:18:12.996611 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:12.996499 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mqmcl/must-gather-5ldxt" Apr 25 00:18:13.123868 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:13.123835 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mqmcl/must-gather-5ldxt"] Apr 25 00:18:13.126448 ip-10-0-139-62 kubenswrapper[2566]: W0425 00:18:13.126419 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41f3037d_df72_4a33_b4fd_cc32b03b81f4.slice/crio-b752b695684bf8f516a5b2629c5aadf1fdae3443ae5ebbdd549f3b44fb41725d WatchSource:0}: Error finding container b752b695684bf8f516a5b2629c5aadf1fdae3443ae5ebbdd549f3b44fb41725d: Status 404 returned error can't find the container with id b752b695684bf8f516a5b2629c5aadf1fdae3443ae5ebbdd549f3b44fb41725d Apr 25 00:18:13.127995 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:13.127969 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 25 00:18:13.895050 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:13.894987 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mqmcl/must-gather-5ldxt" event={"ID":"41f3037d-df72-4a33-b4fd-cc32b03b81f4","Type":"ContainerStarted","Data":"b752b695684bf8f516a5b2629c5aadf1fdae3443ae5ebbdd549f3b44fb41725d"} Apr 25 00:18:18.914023 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:18.913981 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mqmcl/must-gather-5ldxt" event={"ID":"41f3037d-df72-4a33-b4fd-cc32b03b81f4","Type":"ContainerStarted","Data":"963ffaf0945811be8de5d4d2b0cb49e9bebaec6c887c56d8246eeaa3d51b77b8"} Apr 25 00:18:18.914023 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:18.914030 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mqmcl/must-gather-5ldxt" 
event={"ID":"41f3037d-df72-4a33-b4fd-cc32b03b81f4","Type":"ContainerStarted","Data":"d7d0014579b7f17b650b10bf17142b0c25909a4ab9920e5767df1c9c38cf06b5"}
Apr 25 00:18:18.930119 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:18.930063 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mqmcl/must-gather-5ldxt" podStartSLOduration=1.6809516169999998 podStartE2EDuration="6.930046746s" podCreationTimestamp="2026-04-25 00:18:12 +0000 UTC" firstStartedPulling="2026-04-25 00:18:13.128094699 +0000 UTC m=+1493.257302960" lastFinishedPulling="2026-04-25 00:18:18.377189817 +0000 UTC m=+1498.506398089" observedRunningTime="2026-04-25 00:18:18.928568242 +0000 UTC m=+1499.057776536" watchObservedRunningTime="2026-04-25 00:18:18.930046746 +0000 UTC m=+1499.059255026"
Apr 25 00:18:20.437326 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:20.437297 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log"
Apr 25 00:18:20.437788 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:20.437355 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log"
Apr 25 00:18:35.970476 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:35.970441 2566 generic.go:358] "Generic (PLEG): container finished" podID="41f3037d-df72-4a33-b4fd-cc32b03b81f4" containerID="d7d0014579b7f17b650b10bf17142b0c25909a4ab9920e5767df1c9c38cf06b5" exitCode=0
Apr 25 00:18:35.970882 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:35.970488 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mqmcl/must-gather-5ldxt" event={"ID":"41f3037d-df72-4a33-b4fd-cc32b03b81f4","Type":"ContainerDied","Data":"d7d0014579b7f17b650b10bf17142b0c25909a4ab9920e5767df1c9c38cf06b5"}
Apr 25 00:18:35.970882 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:35.970772 2566 scope.go:117] "RemoveContainer" containerID="d7d0014579b7f17b650b10bf17142b0c25909a4ab9920e5767df1c9c38cf06b5"
Apr 25 00:18:36.495691 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:36.495647 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mqmcl_must-gather-5ldxt_41f3037d-df72-4a33-b4fd-cc32b03b81f4/gather/0.log"
Apr 25 00:18:39.864467 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:39.864424 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bx6k8_4d4e321e-40d2-4107-9dbd-581cbfeb3ada/global-pull-secret-syncer/0.log"
Apr 25 00:18:40.070834 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:40.070805 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xfncr_5de5dff9-24ac-4c52-a324-20a9923ea60b/konnectivity-agent/0.log"
Apr 25 00:18:40.131853 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:40.131757 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-62.ec2.internal_68d8ca4751462a8913290f68bba7fb20/haproxy/0.log"
Apr 25 00:18:41.875679 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:41.875639 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mqmcl/must-gather-5ldxt"]
Apr 25 00:18:41.876089 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:41.875869 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-mqmcl/must-gather-5ldxt" podUID="41f3037d-df72-4a33-b4fd-cc32b03b81f4" containerName="copy" containerID="cri-o://963ffaf0945811be8de5d4d2b0cb49e9bebaec6c887c56d8246eeaa3d51b77b8" gracePeriod=2
Apr 25 00:18:41.881482 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:41.881454 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mqmcl/must-gather-5ldxt"]
Apr 25 00:18:41.990985 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:41.990954 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mqmcl_must-gather-5ldxt_41f3037d-df72-4a33-b4fd-cc32b03b81f4/copy/0.log"
Apr 25 00:18:41.991367 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:41.991297 2566 generic.go:358] "Generic (PLEG): container finished" podID="41f3037d-df72-4a33-b4fd-cc32b03b81f4" containerID="963ffaf0945811be8de5d4d2b0cb49e9bebaec6c887c56d8246eeaa3d51b77b8" exitCode=143
Apr 25 00:18:42.110263 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.110240 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mqmcl_must-gather-5ldxt_41f3037d-df72-4a33-b4fd-cc32b03b81f4/copy/0.log"
Apr 25 00:18:42.110586 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.110568 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mqmcl/must-gather-5ldxt"
Apr 25 00:18:42.112652 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.112626 2566 status_manager.go:895] "Failed to get status for pod" podUID="41f3037d-df72-4a33-b4fd-cc32b03b81f4" pod="openshift-must-gather-mqmcl/must-gather-5ldxt" err="pods \"must-gather-5ldxt\" is forbidden: User \"system:node:ip-10-0-139-62.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-mqmcl\": no relationship found between node 'ip-10-0-139-62.ec2.internal' and this object"
Apr 25 00:18:42.270614 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.270523 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw5hl\" (UniqueName: \"kubernetes.io/projected/41f3037d-df72-4a33-b4fd-cc32b03b81f4-kube-api-access-lw5hl\") pod \"41f3037d-df72-4a33-b4fd-cc32b03b81f4\" (UID: \"41f3037d-df72-4a33-b4fd-cc32b03b81f4\") "
Apr 25 00:18:42.270614 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.270580 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41f3037d-df72-4a33-b4fd-cc32b03b81f4-must-gather-output\") pod \"41f3037d-df72-4a33-b4fd-cc32b03b81f4\" (UID: \"41f3037d-df72-4a33-b4fd-cc32b03b81f4\") "
Apr 25 00:18:42.272067 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.272033 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41f3037d-df72-4a33-b4fd-cc32b03b81f4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "41f3037d-df72-4a33-b4fd-cc32b03b81f4" (UID: "41f3037d-df72-4a33-b4fd-cc32b03b81f4"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 25 00:18:42.272844 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.272815 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f3037d-df72-4a33-b4fd-cc32b03b81f4-kube-api-access-lw5hl" (OuterVolumeSpecName: "kube-api-access-lw5hl") pod "41f3037d-df72-4a33-b4fd-cc32b03b81f4" (UID: "41f3037d-df72-4a33-b4fd-cc32b03b81f4"). InnerVolumeSpecName "kube-api-access-lw5hl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 25 00:18:42.371725 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.371688 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lw5hl\" (UniqueName: \"kubernetes.io/projected/41f3037d-df72-4a33-b4fd-cc32b03b81f4-kube-api-access-lw5hl\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 25 00:18:42.371725 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.371720 2566 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/41f3037d-df72-4a33-b4fd-cc32b03b81f4-must-gather-output\") on node \"ip-10-0-139-62.ec2.internal\" DevicePath \"\""
Apr 25 00:18:42.403499 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.403467 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f3037d-df72-4a33-b4fd-cc32b03b81f4" path="/var/lib/kubelet/pods/41f3037d-df72-4a33-b4fd-cc32b03b81f4/volumes"
Apr 25 00:18:42.996201 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.996166 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mqmcl_must-gather-5ldxt_41f3037d-df72-4a33-b4fd-cc32b03b81f4/copy/0.log"
Apr 25 00:18:42.996624 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.996562 2566 scope.go:117] "RemoveContainer" containerID="963ffaf0945811be8de5d4d2b0cb49e9bebaec6c887c56d8246eeaa3d51b77b8"
Apr 25 00:18:42.996624 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:42.996611 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mqmcl/must-gather-5ldxt"
Apr 25 00:18:43.003819 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:43.003793 2566 scope.go:117] "RemoveContainer" containerID="d7d0014579b7f17b650b10bf17142b0c25909a4ab9920e5767df1c9c38cf06b5"
Apr 25 00:18:43.261893 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:43.261805 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-hsllp_c6db25df-5f50-488f-94b9-8d2c23f69077/cluster-monitoring-operator/0.log"
Apr 25 00:18:43.382391 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:43.382359 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-566d956db8-6q454_98aa683f-2faf-4e36-85f7-e01b1d0148bf/metrics-server/0.log"
Apr 25 00:18:43.538860 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:43.538822 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rg6lt_6a77d3d5-2582-4083-80ef-aac6fdfc51b2/node-exporter/0.log"
Apr 25 00:18:43.564061 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:43.564031 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rg6lt_6a77d3d5-2582-4083-80ef-aac6fdfc51b2/kube-rbac-proxy/0.log"
Apr 25 00:18:43.587333 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:43.587304 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rg6lt_6a77d3d5-2582-4083-80ef-aac6fdfc51b2/init-textfile/0.log"
Apr 25 00:18:43.921255 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:43.921140 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-vf7rk_88c50803-315c-421e-8a20-9c331d1c572e/prometheus-operator/0.log"
Apr 25 00:18:43.940308 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:43.940258 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-vf7rk_88c50803-315c-421e-8a20-9c331d1c572e/kube-rbac-proxy/0.log"
Apr 25 00:18:46.264678 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:46.264640 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-sg89f_aa5c2105-93ed-4d18-af5a-1ffb27c236a3/download-server/0.log"
Apr 25 00:18:47.274284 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.274252 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-75ql5_6c180154-dc8b-44dc-a86b-9564a07e09c5/dns/0.log"
Apr 25 00:18:47.292200 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.292167 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-75ql5_6c180154-dc8b-44dc-a86b-9564a07e09c5/kube-rbac-proxy/0.log"
Apr 25 00:18:47.335914 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.335876 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"]
Apr 25 00:18:47.336181 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.336169 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41f3037d-df72-4a33-b4fd-cc32b03b81f4" containerName="gather"
Apr 25 00:18:47.336245 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.336182 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f3037d-df72-4a33-b4fd-cc32b03b81f4" containerName="gather"
Apr 25 00:18:47.336245 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.336194 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41f3037d-df72-4a33-b4fd-cc32b03b81f4" containerName="copy"
Apr 25 00:18:47.336245 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.336199 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f3037d-df72-4a33-b4fd-cc32b03b81f4" containerName="copy"
Apr 25 00:18:47.336343 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.336262 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="41f3037d-df72-4a33-b4fd-cc32b03b81f4" containerName="copy"
Apr 25 00:18:47.336343 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.336270 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="41f3037d-df72-4a33-b4fd-cc32b03b81f4" containerName="gather"
Apr 25 00:18:47.341426 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.341398 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.344347 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.344320 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sv58q\"/\"kube-root-ca.crt\""
Apr 25 00:18:47.344485 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.344406 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sv58q\"/\"openshift-service-ca.crt\""
Apr 25 00:18:47.345077 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.345054 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-sv58q\"/\"default-dockercfg-h5ljq\""
Apr 25 00:18:47.345561 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.345539 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"]
Apr 25 00:18:47.462437 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.462403 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jgjtn_3a61a51c-e12c-4ab5-ac3f-b0d6b58d6ea2/dns-node-resolver/0.log"
Apr 25 00:18:47.511024 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.510985 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqxp7\" (UniqueName: \"kubernetes.io/projected/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-kube-api-access-lqxp7\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.511024 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.511026 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-sys\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.511402 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.511245 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-podres\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.511402 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.511285 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-proc\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.511402 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.511322 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-lib-modules\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.612323 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.612279 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqxp7\" (UniqueName: \"kubernetes.io/projected/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-kube-api-access-lqxp7\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.612323 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.612329 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-sys\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.612606 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.612373 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-podres\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.612606 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.612397 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-proc\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.612606 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.612423 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-lib-modules\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.612606 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.612494 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-sys\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.612606 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.612509 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-proc\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.612606 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.612538 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-lib-modules\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.612606 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.612543 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-podres\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.620134 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.620101 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqxp7\" (UniqueName: \"kubernetes.io/projected/e4d03b97-645f-4ed6-9c5a-0f53df2b5c62-kube-api-access-lqxp7\") pod \"perf-node-gather-daemonset-w49fp\" (UID: \"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.651959 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.651921 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:47.771149 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.771115 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"]
Apr 25 00:18:47.774571 ip-10-0-139-62 kubenswrapper[2566]: W0425 00:18:47.774537 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode4d03b97_645f_4ed6_9c5a_0f53df2b5c62.slice/crio-24f8f50a7984d5181d9423709ab1796de016615031b7d1174168b70d1266184f WatchSource:0}: Error finding container 24f8f50a7984d5181d9423709ab1796de016615031b7d1174168b70d1266184f: Status 404 returned error can't find the container with id 24f8f50a7984d5181d9423709ab1796de016615031b7d1174168b70d1266184f
Apr 25 00:18:47.869381 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.869290 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-pruner-29617920-dnclw_8570f506-620d-4a4f-8a35-f62c0a517524/image-pruner/0.log"
Apr 25 00:18:47.916722 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:47.916692 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-64t8l_7fd21580-2e57-4cb2-8470-18fa0629553c/node-ca/0.log"
Apr 25 00:18:48.013761 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:48.013726 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp" event={"ID":"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62","Type":"ContainerStarted","Data":"a6ae923794081995d1cfb51b03c81fc04c14d63dca81b4d10b60e01f1bfa3d98"}
Apr 25 00:18:48.013761 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:48.013761 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp" event={"ID":"e4d03b97-645f-4ed6-9c5a-0f53df2b5c62","Type":"ContainerStarted","Data":"24f8f50a7984d5181d9423709ab1796de016615031b7d1174168b70d1266184f"}
Apr 25 00:18:48.013996 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:48.013795 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:48.030450 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:48.030400 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp" podStartSLOduration=1.030384698 podStartE2EDuration="1.030384698s" podCreationTimestamp="2026-04-25 00:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:18:48.028540435 +0000 UTC m=+1528.157748728" watchObservedRunningTime="2026-04-25 00:18:48.030384698 +0000 UTC m=+1528.159592977"
Apr 25 00:18:48.677866 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:48.677834 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-84b54f7d86-6hsq2_3fc3cbbb-c17a-46c2-bec2-3461b9d1bf31/router/0.log"
Apr 25 00:18:48.991398 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:48.991299 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b956z_47c85934-321e-42e7-9abd-19c5dc8818e0/serve-healthcheck-canary/0.log"
Apr 25 00:18:49.394802 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:49.394769 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gvs6p_97e517a0-5f4e-40ee-b905-26216127aa87/kube-rbac-proxy/0.log"
Apr 25 00:18:49.414672 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:49.414641 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gvs6p_97e517a0-5f4e-40ee-b905-26216127aa87/exporter/0.log"
Apr 25 00:18:49.469222 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:49.469177 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gvs6p_97e517a0-5f4e-40ee-b905-26216127aa87/extractor/0.log"
Apr 25 00:18:51.620044 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:51.620008 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-64c4d9588d-ddv76_687a38ca-26ee-4f6b-8d3e-26a0edaa1e1b/manager/0.log"
Apr 25 00:18:51.653888 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:51.653857 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-njnrk_51d00fe8-d244-4e1b-b5cc-ad0affd91ea5/manager/0.log"
Apr 25 00:18:54.027226 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:54.027183 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-w49fp"
Apr 25 00:18:57.246632 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:57.246605 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pcfrr_2aa55bd4-e281-4226-85cb-d9aa2ce0bd34/kube-multus-additional-cni-plugins/0.log"
Apr 25 00:18:57.266195 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:57.266168 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pcfrr_2aa55bd4-e281-4226-85cb-d9aa2ce0bd34/egress-router-binary-copy/0.log"
Apr 25 00:18:57.285105 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:57.285022 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pcfrr_2aa55bd4-e281-4226-85cb-d9aa2ce0bd34/cni-plugins/0.log"
Apr 25 00:18:57.302388 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:57.302361 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pcfrr_2aa55bd4-e281-4226-85cb-d9aa2ce0bd34/bond-cni-plugin/0.log"
Apr 25 00:18:57.320939 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:57.320914 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pcfrr_2aa55bd4-e281-4226-85cb-d9aa2ce0bd34/routeoverride-cni/0.log"
Apr 25 00:18:57.338766 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:57.338739 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pcfrr_2aa55bd4-e281-4226-85cb-d9aa2ce0bd34/whereabouts-cni-bincopy/0.log"
Apr 25 00:18:57.356508 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:57.356484 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pcfrr_2aa55bd4-e281-4226-85cb-d9aa2ce0bd34/whereabouts-cni/0.log"
Apr 25 00:18:57.518758 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:57.518726 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g4bsj_f7d067fa-72fb-42f4-92b9-edee24d3ed1e/kube-multus/0.log"
Apr 25 00:18:57.537916 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:57.537840 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c6pqs_f9f062da-f1a8-4e5a-ac2f-ad672791353b/network-metrics-daemon/0.log"
Apr 25 00:18:57.554190 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:57.554153 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c6pqs_f9f062da-f1a8-4e5a-ac2f-ad672791353b/kube-rbac-proxy/0.log"
Apr 25 00:18:58.905728 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:58.905654 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-controller/0.log"
Apr 25 00:18:58.920948 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:58.920910 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/0.log"
Apr 25 00:18:58.934002 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:58.933974 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovn-acl-logging/1.log"
Apr 25 00:18:58.957282 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:58.957247 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/kube-rbac-proxy-node/0.log"
Apr 25 00:18:58.978683 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:58.978652 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 25 00:18:58.993057 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:58.993023 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/northd/0.log"
Apr 25 00:18:59.010890 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:59.010858 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/nbdb/0.log"
Apr 25 00:18:59.028771 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:59.028746 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/sbdb/0.log"
Apr 25 00:18:59.182200 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:18:59.182108 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfv9v_f63c8907-4f05-4332-84e3-9ca9c74f643c/ovnkube-controller/0.log"
Apr 25 00:19:00.193743 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:19:00.193708 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vq8nz_97ecc28e-c411-4b57-86a8-d793acbd08ad/network-check-target-container/0.log"
Apr 25 00:19:01.048140 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:19:01.048110 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2cb6b_2e19a4fe-f15a-4907-a972-471635868ded/iptables-alerter/0.log"
Apr 25 00:19:01.667003 ip-10-0-139-62 kubenswrapper[2566]: I0425 00:19:01.666970 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-skn6p_d0fe631a-be83-446b-90d8-57f1d40d01e3/tuned/0.log"