Apr 22 19:23:31.201055 ip-10-0-135-144 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:23:31.615438 ip-10-0-135-144 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:31.615438 ip-10-0-135-144 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:23:31.615438 ip-10-0-135-144 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:31.615438 ip-10-0-135-144 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:23:31.615438 ip-10-0-135-144 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:23:31.617765 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.617649    2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:23:31.624260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624244    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:31.624260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624260    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624265    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624268    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624276    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624280    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624282    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624285    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624288    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624291    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624294    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624296    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624299    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624301    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624304    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624307    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624310    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624312    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624315    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624317    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624320    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:31.624336 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624323    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624327    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624329    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624332    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624334    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624337    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624340    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624342    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624345    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624348    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624351    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624355    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624359    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624361    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624364    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624367    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624370    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624372    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624375    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624377    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:31.624838 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624380    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624382    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624385    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624387    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624390    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624392    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624395    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624397    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624401    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624403    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624406    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624408    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624411    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624413    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624417    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624420    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624422    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624425    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624428    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624430    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:31.625370 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624433    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624436    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624438    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624441    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624443    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624446    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624448    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624450    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624455    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624458    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624462    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624464    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624468    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624471    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624474    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624476    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624479    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624482    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624486    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:31.625875 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624489    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624492    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624494    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624497    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624501    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624503    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624910    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624916    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624919    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624922    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624925    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624927    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624930    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624933    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624935    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624947    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624950    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624953    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624955    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624958    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:31.626339 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624961    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624964    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624967    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624970    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624973    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624975    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624978    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624982    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624984    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624987    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624989    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624992    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624995    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.624997    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625000    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625002    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625005    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625007    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625010    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625012    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:31.626884 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625015    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625018    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625020    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625023    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625026    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625030    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625032    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625036    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625039    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625042    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625045    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625048    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625050    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625053    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625056    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625059    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625062    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625064    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625067    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:31.627419 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625070    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625073    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625076    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625079    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625081    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625083    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625086    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625089    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625091    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625094    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625097    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625099    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625102    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625104    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625107    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625110    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625112    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625115    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625118    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625122    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:31.627913 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625124    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625127    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625129    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625132    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625135    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625137    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625140    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625142    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625145    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625148    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625151    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625158    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.625161    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626555    2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626564    2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626570    2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626574    2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626579    2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626582    2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626587    2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626591    2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:23:31.628398 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626595    2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626598    2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626601    2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626605    2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626608    2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626611    2574 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626614    2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626617    2574 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626620    2574 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626623    2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626626    2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626631    2574 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626634    2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626637    2574 flags.go:64] FLAG: --config-dir=""
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626640    2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626643    2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626647    2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626650    2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626654    2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626658    2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626661    2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626667    2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626671    2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626675    2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626678    2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:23:31.628928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626682    2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626685    2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626688    2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626691    2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626694    2574 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626697    2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626702    2574 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626705    2574 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626708    2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626711    2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626728    2574 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626732    2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626735    2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626738    2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626741    2574 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626745    2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626747    2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]:
I0422 19:23:31.626750 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626753 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626756 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626759 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626762 2574 flags.go:64] FLAG: --feature-gates="" Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626766 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626769 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626772 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:23:31.629536 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626776 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626779 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626783 2574 flags.go:64] FLAG: --help="false" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626786 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-135-144.ec2.internal" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626789 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626792 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626796 2574 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626799 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626802 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626805 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626808 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626811 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626814 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626816 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626820 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626822 2574 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626826 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626828 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626832 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626834 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:23:31.630162 ip-10-0-135-144 
kubenswrapper[2574]: I0422 19:23:31.626837 2574 flags.go:64] FLAG: --lock-file="" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626840 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626843 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626846 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:23:31.630162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626852 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626854 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626858 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626861 2574 flags.go:64] FLAG: --logging-format="text" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626864 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626868 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626870 2574 flags.go:64] FLAG: --manifest-url="" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626874 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626878 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626881 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626885 2574 flags.go:64] FLAG: --max-pods="110" Apr 22 
19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626888 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626891 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626895 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626898 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626901 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626905 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626908 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626914 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626917 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626921 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626924 2574 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626927 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:23:31.630764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626932 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 
19:23:31.626935 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626938 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626941 2574 flags.go:64] FLAG: --port="10250" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626944 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626947 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0530e40800535ebe4" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626950 2574 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626954 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626957 2574 flags.go:64] FLAG: --register-node="true" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626960 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626963 2574 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626967 2574 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626970 2574 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626973 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626976 2574 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626979 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626983 2574 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626986 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626988 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626991 2574 flags.go:64] FLAG: --runonce="false" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626994 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.626997 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627000 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627003 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627006 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627009 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:23:31.631367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627012 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627015 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627018 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627020 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627023 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 
19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627026 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627029 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627032 2574 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627035 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627040 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627043 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627046 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627053 2574 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627055 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627058 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627061 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627064 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627067 2574 flags.go:64] FLAG: --v="2" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627072 2574 flags.go:64] FLAG: --version="false" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627076 2574 flags.go:64] FLAG: --vmodule="" 
Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627081 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627085 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627179 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627183 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:23:31.631997 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627186 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627189 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627193 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627195 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627198 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627201 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627204 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627206 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:23:31.632571 
ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627209 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627211 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627214 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627216 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627219 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627221 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627224 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627226 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627228 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627231 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627233 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:23:31.632571 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627236 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627240 2574 feature_gate.go:328] unrecognized feature 
gate: AWSClusterHostedDNS Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627242 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627246 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627248 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627251 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627253 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627256 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627259 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627263 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627266 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627268 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627271 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627273 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 
19:23:31.627276 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627278 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627281 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627284 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627286 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:23:31.633125 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627289 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627293 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627296 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627300 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627302 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627305 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627308 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627311 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627313 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627316 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627318 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627321 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627324 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627326 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627348 2574 feature_gate.go:328] 
unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627352 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627355 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627358 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627361 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:23:31.633593 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627364 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627367 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627370 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627374 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627377 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627379 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627382 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627385 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:23:31.634203 ip-10-0-135-144 
kubenswrapper[2574]: W0422 19:23:31.627388 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627390 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627393 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627395 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627398 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627400 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627403 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627406 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627409 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627411 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627414 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627416 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:23:31.634203 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627419 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 
19:23:31.634993 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627421 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:31.634993 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627424 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:31.634993 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627427 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:31.634993 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627430 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:31.634993 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627432 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:31.634993 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.627435 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:31.634993 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.627442 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:31.636045 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.636026 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:23:31.636084 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.636046 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:23:31.636116 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636097 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:31.636116 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636103 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:31.636116 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636106 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:31.636116 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636110 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:31.636116 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636113 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:31.636116 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636117 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636120 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636123 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636126 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636128 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636131 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636134 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636136 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636139 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636142 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636144 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636147 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636149 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636152 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636155 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636158 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636161 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636164 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636166 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636169 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:31.636267 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636172 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636174 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636177 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636180 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636182 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636185 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636187 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636190 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636192 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636195 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636197 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636200 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636203 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636205 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636209 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636212 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636214 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636217 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636221 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:31.636775 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636224 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636226 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636229 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636231 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636234 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636236 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636239 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636242 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636244 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636247 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636250 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636252 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636255 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636257 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636260 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636264 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636267 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636271 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636273 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636276 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:31.637261 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636279 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636282 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636285 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636288 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636293 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636295 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636298 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636301 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636304 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636307 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636310 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636313 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636316 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636318 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636321 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636323 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636326 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636328 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636331 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:31.637796 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636333 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636336 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636338 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.636343 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636461 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636467 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636470 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636472 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636476 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636478 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636481 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636483 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636486 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636488 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636491 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:23:31.638260 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636494 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636496 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636500 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636502 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636505 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636508 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636510 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636513 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636516 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636519 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636521 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636524 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636527 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636529 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636532 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636534 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636537 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636539 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636542 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636545 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:23:31.638629 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636547 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636549 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636552 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636554 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636556 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636559 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636561 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636564 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636566 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636569 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636571 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636573 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636576 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636579 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636581 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636584 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636586 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636589 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636592 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636594 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:23:31.639121 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636597 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636599 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636602 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636604 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636607 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636610 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636612 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636615 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636617 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636620 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636622 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636625 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636627 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636630 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636632 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636634 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636637 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636639 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636642 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636644 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:23:31.639601 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636647 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636651 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636654 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636657 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636660 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636662 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636665 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636667 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636670 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636673 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636675 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636678 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636680 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636684 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:23:31.640104 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:31.636687 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:23:31.640460 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.636692 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:23:31.640460 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.637426 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:23:31.640460 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.639552 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:23:31.640460 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.640411 2574 server.go:1019] "Starting client certificate rotation"
Apr 22 19:23:31.640562 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.640507 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:23:31.640562 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.640544 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:23:31.663011 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.662992 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:23:31.665947 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.665902 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:23:31.675267 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.675247 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:23:31.681320 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.681304 2574 log.go:25] "Validated CRI v1 image API"
Apr 22 19:23:31.682558 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.682545 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:23:31.686001 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.685983 2574 fs.go:135] Filesystem UUIDs: map[23f52f01-de22-4e53-8279-6a0db20195c6:/dev/nvme0n1p3 6cedb118-ee39-4c6a-bb45-093537e05de0:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 22 19:23:31.686066 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.686001 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:23:31.692902 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.692785 2574 manager.go:217] Machine: {Timestamp:2026-04-22 19:23:31.690843094 +0000 UTC m=+0.379866186 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3113722 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d895aa8a29f4d263968fdb66284da SystemUUID:ec2d895a-a8a2-9f4d-2639-68fdb66284da BootID:dead1ae3-49ec-4f95-92bf-d623e9f894ed Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c4:73:19:a5:c7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c4:73:19:a5:c7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:13:f0:70:62:83 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:23:31.693485 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.693471 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:23:31.693601 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.693584 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:23:31.696283 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.696257 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:23:31.696443 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.696286 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-144.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Qu
antity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:23:31.696519 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.696454 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:23:31.696519 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.696467 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:23:31.696519 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.696486 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:23:31.696736 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.696699 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:23:31.697346 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.697332 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:23:31.699169 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.699157 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:23:31.699299 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.699288 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:23:31.701797 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.701783 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:23:31.701839 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.701800 2574 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 22 19:23:31.701839 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.701812 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:23:31.701839 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.701821 2574 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:23:31.701839 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.701830 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 19:23:31.702812 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.702801 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:23:31.702856 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.702845 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:23:31.705807 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.705792 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:23:31.707195 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.707182 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:23:31.708802 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.708785 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:23:31.708846 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.708833 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:23:31.708846 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.708844 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:23:31.708901 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.708863 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:23:31.708901 
ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.708872 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:23:31.708901 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.708881 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:23:31.708901 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.708889 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 19:23:31.708901 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.708898 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:23:31.709038 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.708908 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:23:31.709038 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.708917 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:23:31.709038 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.708946 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:23:31.709038 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.708960 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:23:31.709769 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.709708 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:23:31.709800 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.709781 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:23:31.713477 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.713463 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:23:31.713546 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.713497 2574 server.go:1295] "Started kubelet" Apr 22 19:23:31.713645 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.713591 2574 ratelimit.go:55] "Setting rate limiting for endpoint" 
service="podresources" qps=100 burstTokens=10 Apr 22 19:23:31.713808 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.713671 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:23:31.714138 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.714121 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:23:31.714423 ip-10-0-135-144 systemd[1]: Started Kubernetes Kubelet. Apr 22 19:23:31.715369 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.715314 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:23:31.716764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.716748 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:23:31.716931 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.716895 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:23:31.717005 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.716908 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:23:31.717005 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.716985 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:23:31.720352 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.720322 2574 csr.go:274] 
"Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7h7vg" Apr 22 19:23:31.721975 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.721958 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:23:31.722470 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.722450 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:23:31.722559 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.721542 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-144.ec2.internal.18a8c43b3d9cff58 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-144.ec2.internal,UID:ip-10-0-135-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-144.ec2.internal,},FirstTimestamp:2026-04-22 19:23:31.713474392 +0000 UTC m=+0.402497484,LastTimestamp:2026-04-22 19:23:31.713474392 +0000 UTC m=+0.402497484,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-144.ec2.internal,}" Apr 22 19:23:31.722987 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.722967 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:23:31.723061 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.722993 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:23:31.723061 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.722999 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:23:31.723061 ip-10-0-135-144 kubenswrapper[2574]: I0422 
19:23:31.723061 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:23:31.723212 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.723070 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:23:31.723264 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.723207 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 19:23:31.724966 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.724949 2574 factory.go:55] Registering systemd factory Apr 22 19:23:31.725051 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.725029 2574 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:23:31.725347 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.725332 2574 factory.go:153] Registering CRI-O factory Apr 22 19:23:31.725347 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.725349 2574 factory.go:223] Registration of the crio container factory successfully Apr 22 19:23:31.725495 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.725397 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:23:31.725495 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.725425 2574 factory.go:103] Registering Raw factory Apr 22 19:23:31.725495 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.725439 2574 manager.go:1196] Started watching for new ooms in manager Apr 22 19:23:31.725883 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.725867 2574 manager.go:319] Starting recovery of all containers Apr 22 19:23:31.726000 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.725963 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:23:31.726087 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.726002 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 19:23:31.726087 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.726022 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 19:23:31.728136 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.728113 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7h7vg" Apr 22 19:23:31.738120 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.738001 2574 manager.go:324] Recovery completed Apr 22 19:23:31.742062 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.742050 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:31.745981 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.745965 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:31.746064 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.745992 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:31.746064 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.746005 2574 kubelet_node_status.go:736] "Recording event message 
for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:31.746471 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.746457 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:23:31.746471 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.746468 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:23:31.746554 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.746502 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:23:31.748807 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.748796 2574 policy_none.go:49] "None policy: Start" Apr 22 19:23:31.748843 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.748810 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:23:31.748843 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.748820 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:23:31.792810 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.790296 2574 manager.go:341] "Starting Device Plugin manager" Apr 22 19:23:31.792810 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.790324 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:23:31.792810 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.790333 2574 server.go:85] "Starting device plugin registration server" Apr 22 19:23:31.792810 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.790565 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:23:31.792810 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.790577 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:23:31.792810 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.790676 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:23:31.792810 ip-10-0-135-144 kubenswrapper[2574]: 
I0422 19:23:31.790776 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:23:31.792810 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.790784 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:23:31.792810 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.791312 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:23:31.792810 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.791348 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 19:23:31.859310 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.859286 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:23:31.860408 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.860380 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:23:31.860408 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.860408 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:23:31.860544 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.860425 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 19:23:31.860544 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.860431 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:23:31.860544 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.860461 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:23:31.863102 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.863078 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:31.891498 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.891446 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:31.892399 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.892384 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:31.892487 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.892416 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:31.892487 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.892435 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:31.892487 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.892464 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-144.ec2.internal" Apr 22 19:23:31.902954 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.902936 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-144.ec2.internal" Apr 22 19:23:31.903013 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.902957 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-144.ec2.internal\": node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 
19:23:31.934253 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.934231 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 19:23:31.961507 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.961485 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal"] Apr 22 19:23:31.961585 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.961556 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:31.963149 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.963134 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:31.963233 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.963168 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:31.963233 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.963182 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:31.964507 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.964492 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:31.964645 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.964630 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 22 19:23:31.964690 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.964662 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:31.965384 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.965355 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:31.965384 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.965373 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:31.965502 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.965388 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:31.965502 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.965398 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:31.965502 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.965404 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:31.965502 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.965409 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:31.966668 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.966651 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" Apr 22 19:23:31.966775 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.966675 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:23:31.967343 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.967328 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:23:31.967400 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.967356 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:23:31.967400 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:31.967369 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:23:31.991306 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.991289 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-144.ec2.internal\" not found" node="ip-10-0-135-144.ec2.internal" Apr 22 19:23:31.995530 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:31.995515 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-144.ec2.internal\" not found" node="ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.024176 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.024150 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d2b23199daad0afd959c853a99042fee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal\" (UID: \"d2b23199daad0afd959c853a99042fee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.024176 ip-10-0-135-144 
kubenswrapper[2574]: I0422 19:23:32.024177 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2b23199daad0afd959c853a99042fee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal\" (UID: \"d2b23199daad0afd959c853a99042fee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.024319 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.024195 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fab854a3125f200d91109b3a7636112a-config\") pod \"kube-apiserver-proxy-ip-10-0-135-144.ec2.internal\" (UID: \"fab854a3125f200d91109b3a7636112a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.034418 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:32.034400 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 19:23:32.125327 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.125301 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d2b23199daad0afd959c853a99042fee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal\" (UID: \"d2b23199daad0afd959c853a99042fee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.125428 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.125333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2b23199daad0afd959c853a99042fee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal\" (UID: \"d2b23199daad0afd959c853a99042fee\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.125428 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.125349 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fab854a3125f200d91109b3a7636112a-config\") pod \"kube-apiserver-proxy-ip-10-0-135-144.ec2.internal\" (UID: \"fab854a3125f200d91109b3a7636112a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.125428 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.125387 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d2b23199daad0afd959c853a99042fee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal\" (UID: \"d2b23199daad0afd959c853a99042fee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.125428 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.125402 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fab854a3125f200d91109b3a7636112a-config\") pod \"kube-apiserver-proxy-ip-10-0-135-144.ec2.internal\" (UID: \"fab854a3125f200d91109b3a7636112a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.125428 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.125413 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d2b23199daad0afd959c853a99042fee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal\" (UID: \"d2b23199daad0afd959c853a99042fee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.135323 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:32.135302 2574 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 19:23:32.236476 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:32.236403 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 19:23:32.293579 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.293551 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.298254 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.298234 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.337350 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:32.337326 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 19:23:32.437867 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:32.437823 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 19:23:32.538394 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:32.538310 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 19:23:32.638823 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:32.638790 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 19:23:32.639879 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.639861 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 19:23:32.640007 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.639990 2574 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:23:32.660669 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.660648 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:32.698885 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.698862 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:32.722506 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.722482 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 19:23:32.730853 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.730824 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:18:31 +0000 UTC" deadline="2027-10-12 06:53:14.484040999 +0000 UTC" Apr 22 19:23:32.730955 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.730850 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12899h29m41.753193392s" Apr 22 19:23:32.734325 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.734308 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:23:32.738856 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:32.738836 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 19:23:32.758952 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.758934 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-g2pdg" Apr 22 19:23:32.767163 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.767147 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-g2pdg" Apr 22 19:23:32.768426 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:32.768401 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2b23199daad0afd959c853a99042fee.slice/crio-ae7a7245a2ea8d4c13ea0509107c85a72d7a46b691160114dd9d94bc9141e267 WatchSource:0}: Error finding container ae7a7245a2ea8d4c13ea0509107c85a72d7a46b691160114dd9d94bc9141e267: Status 404 returned error can't find the container with id ae7a7245a2ea8d4c13ea0509107c85a72d7a46b691160114dd9d94bc9141e267 Apr 22 19:23:32.768731 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:32.768694 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfab854a3125f200d91109b3a7636112a.slice/crio-996f7c3c66880731b665141e04f1df2cab43621a21fef5b1b66c2f99ac66d575 WatchSource:0}: Error finding container 996f7c3c66880731b665141e04f1df2cab43621a21fef5b1b66c2f99ac66d575: Status 404 returned error can't find the container with id 996f7c3c66880731b665141e04f1df2cab43621a21fef5b1b66c2f99ac66d575 Apr 22 19:23:32.772210 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.772196 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:23:32.839403 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:32.839346 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-144.ec2.internal\" not found" Apr 22 19:23:32.863287 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.863245 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" 
event={"ID":"d2b23199daad0afd959c853a99042fee","Type":"ContainerStarted","Data":"ae7a7245a2ea8d4c13ea0509107c85a72d7a46b691160114dd9d94bc9141e267"} Apr 22 19:23:32.864170 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.864152 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" event={"ID":"fab854a3125f200d91109b3a7636112a","Type":"ContainerStarted","Data":"996f7c3c66880731b665141e04f1df2cab43621a21fef5b1b66c2f99ac66d575"} Apr 22 19:23:32.883243 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.883222 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:32.923629 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.923589 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.936723 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.936705 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:23:32.937616 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.937604 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" Apr 22 19:23:32.944227 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:32.944207 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:23:33.494935 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.494901 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:23:33.703374 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.703344 2574 apiserver.go:52] "Watching apiserver" Apr 22 
19:23:33.711885 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.711863 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:23:33.714240 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.714213 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-27hl5","openshift-cluster-node-tuning-operator/tuned-dtzjz","openshift-image-registry/node-ca-b9dhn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal","openshift-multus/multus-additional-cni-plugins-grxvt","openshift-multus/multus-mz5vn","openshift-network-diagnostics/network-check-target-fhzh7","openshift-ovn-kubernetes/ovnkube-node-rffw9","kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg","openshift-dns/node-resolver-cqg2m","openshift-multus/network-metrics-daemon-rwxr2","openshift-network-operator/iptables-alerter-5zp6w"] Apr 22 19:23:33.715883 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.715861 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.717061 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.717043 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.718876 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.718778 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 19:23:33.719066 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.719046 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-b9dhn" Apr 22 19:23:33.720073 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.720044 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 19:23:33.720073 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.720062 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 19:23:33.720246 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.720115 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 19:23:33.720246 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.720062 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 19:23:33.720368 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.720272 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:33.720423 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.720410 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 19:23:33.720531 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.720514 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4bwq8\"" Apr 22 19:23:33.720798 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.720782 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5zp6w" Apr 22 19:23:33.721531 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.721386 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:23:33.721531 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.721409 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:33.721531 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.721437 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:23:33.722140 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.722120 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-grxvt" Apr 22 19:23:33.722260 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.722170 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2hncx\"" Apr 22 19:23:33.722761 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.722128 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:23:33.724600 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.723853 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:23:33.724600 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.724361 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:23:33.724816 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.724606 2574 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:23:33.724816 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.724707 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gk94m\"" Apr 22 19:23:33.725197 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.725071 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 19:23:33.725197 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.725113 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-nzrtl\"" Apr 22 19:23:33.726162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.726043 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 19:23:33.726162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.726143 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 19:23:33.727222 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.726849 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" Apr 22 19:23:33.727222 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.726882 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 19:23:33.727222 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.726987 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-78k8d\"" Apr 22 19:23:33.727222 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.727067 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 19:23:33.729133 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.729118 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7" Apr 22 19:23:33.729237 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:33.729184 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43" Apr 22 19:23:33.729310 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.729243 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.729433 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.729401 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 19:23:33.729624 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.729610 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 19:23:33.729706 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.729649 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 19:23:33.729706 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.729649 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8hszl\"" Apr 22 19:23:33.730503 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.730486 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cqg2m" Apr 22 19:23:33.731586 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.731570 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wz2bv\"" Apr 22 19:23:33.731847 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.731817 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 19:23:33.731946 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.731925 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:23:33.732043 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:33.732023 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e" Apr 22 19:23:33.733119 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.733096 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-27hl5" Apr 22 19:23:33.733237 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.733219 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xfjtr\"" Apr 22 19:23:33.733310 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.733223 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:23:33.733388 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.733357 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:23:33.734406 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734384 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2scc\" (UniqueName: \"kubernetes.io/projected/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-kube-api-access-j2scc\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" Apr 22 19:23:33.734511 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734421 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-run-openvswitch\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.734511 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-log-socket\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.734511 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734469 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-tuned\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.734511 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ab5f4389-6ac2-4eab-b05a-9657f9124db1-host-slash\") pod \"iptables-alerter-5zp6w\" (UID: \"ab5f4389-6ac2-4eab-b05a-9657f9124db1\") " pod="openshift-network-operator/iptables-alerter-5zp6w" Apr 22 19:23:33.734709 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734519 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb28e67f-4312-4175-a5fa-26a033fdf402-cnibin\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt" Apr 22 
19:23:33.734709 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734542 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-cni-bin\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.734709 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-registration-dir\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" Apr 22 19:23:33.734709 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734603 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-run\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.734709 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734627 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-var-lib-kubelet\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.734709 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734641 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb28e67f-4312-4175-a5fa-26a033fdf402-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt" Apr 22 19:23:33.734709 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734691 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-sysconfig\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734784 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k29j\" (UniqueName: \"kubernetes.io/projected/ab5f4389-6ac2-4eab-b05a-9657f9124db1-kube-api-access-2k29j\") pod \"iptables-alerter-5zp6w\" (UID: \"ab5f4389-6ac2-4eab-b05a-9657f9124db1\") " pod="openshift-network-operator/iptables-alerter-5zp6w" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734813 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/001543e3-9932-4f51-a285-c188ebe53071-ovn-node-metrics-cert\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 
19:23:33.734837 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb28e67f-4312-4175-a5fa-26a033fdf402-os-release\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-cni-netd\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734884 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-socket-dir\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734898 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-run-netns\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734914 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmmr\" (UniqueName: \"kubernetes.io/projected/001543e3-9932-4f51-a285-c188ebe53071-kube-api-access-tgmmr\") pod \"ovnkube-node-rffw9\" (UID: 
\"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-tmp\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.734965 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-device-dir\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735008 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-var-lib-openvswitch\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735024 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/001543e3-9932-4f51-a285-c188ebe53071-ovnkube-script-lib\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.735053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735044 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-sysctl-conf\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735066 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-host\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735085 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb28e67f-4312-4175-a5fa-26a033fdf402-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735110 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-sys-fs\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eb28e67f-4312-4175-a5fa-26a033fdf402-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735156 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-node-log\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-sysctl-d\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmzzz\" (UniqueName: \"kubernetes.io/projected/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-kube-api-access-fmzzz\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735223 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/277e1d66-9594-42a8-b953-7fcddeac7dad-host\") pod \"node-ca-b9dhn\" (UID: \"277e1d66-9594-42a8-b953-7fcddeac7dad\") " pod="openshift-image-registry/node-ca-b9dhn"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9ltq\" (UniqueName: \"kubernetes.io/projected/277e1d66-9594-42a8-b953-7fcddeac7dad-kube-api-access-c9ltq\") pod \"node-ca-b9dhn\" (UID: \"277e1d66-9594-42a8-b953-7fcddeac7dad\") " pod="openshift-image-registry/node-ca-b9dhn"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735305 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/001543e3-9932-4f51-a285-c188ebe53071-ovnkube-config\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735336 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-run-systemd\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-etc-openvswitch\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735388 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ab5f4389-6ac2-4eab-b05a-9657f9124db1-iptables-alerter-script\") pod \"iptables-alerter-5zp6w\" (UID: \"ab5f4389-6ac2-4eab-b05a-9657f9124db1\") " pod="openshift-network-operator/iptables-alerter-5zp6w"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735412 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735436 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-run-ovn-kubernetes\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.735568 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-lib-modules\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735477 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/001543e3-9932-4f51-a285-c188ebe53071-env-overrides\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735491 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb28e67f-4312-4175-a5fa-26a033fdf402-cni-binary-copy\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735504 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnq6q\" (UniqueName: \"kubernetes.io/projected/eb28e67f-4312-4175-a5fa-26a033fdf402-kube-api-access-qnq6q\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735541 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-kubelet\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-systemd-units\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-kubernetes\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735620 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/277e1d66-9594-42a8-b953-7fcddeac7dad-serviceca\") pod \"node-ca-b9dhn\" (UID: \"277e1d66-9594-42a8-b953-7fcddeac7dad\") " pod="openshift-image-registry/node-ca-b9dhn"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-slash\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735664 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-run-ovn\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-modprobe-d\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735709 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qn599\""
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735708 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-sys\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb28e67f-4312-4175-a5fa-26a033fdf402-system-cni-dir\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-etc-selinux\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735812 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-systemd\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735907 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:23:33.736216 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.735907 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:23:33.768023 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.767948 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:32 +0000 UTC" deadline="2027-12-03 17:04:42.209749247 +0000 UTC"
Apr 22 19:23:33.768023 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.767982 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14157h41m8.441773645s"
Apr 22 19:23:33.824332 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.824307 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 19:23:33.836521 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-sys\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.836639 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836527 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-var-lib-cni-bin\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn"
Apr 22 19:23:33.836639 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836555 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb28e67f-4312-4175-a5fa-26a033fdf402-system-cni-dir\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.836639 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-run-netns\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn"
Apr 22 19:23:33.836639 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836623 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb28e67f-4312-4175-a5fa-26a033fdf402-system-cni-dir\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.836868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836637 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-sys\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.836868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836679 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-etc-selinux\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.836868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-systemd\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.836868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836739 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-var-lib-cni-multus\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn"
Apr 22 19:23:33.836868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-var-lib-kubelet\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn"
Apr 22 19:23:33.836868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836774 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-systemd\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.836868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836788 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-etc-kubernetes\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn"
Apr 22 19:23:33.836868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836804 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9z9h\" (UniqueName: \"kubernetes.io/projected/3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2-kube-api-access-s9z9h\") pod \"node-resolver-cqg2m\" (UID: \"3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2\") " pod="openshift-dns/node-resolver-cqg2m"
Apr 22 19:23:33.836868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836824 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-etc-selinux\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.836868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836843 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2scc\" (UniqueName: \"kubernetes.io/projected/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-kube-api-access-j2scc\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-run-openvswitch\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836918 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-log-socket\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836947 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-tuned\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836971 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ab5f4389-6ac2-4eab-b05a-9657f9124db1-host-slash\") pod \"iptables-alerter-5zp6w\" (UID: \"ab5f4389-6ac2-4eab-b05a-9657f9124db1\") " pod="openshift-network-operator/iptables-alerter-5zp6w"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.836993 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb28e67f-4312-4175-a5fa-26a033fdf402-cnibin\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837000 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-log-socket\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837017 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-cni-bin\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-run-openvswitch\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837041 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-registration-dir\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837063 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-run\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837067 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ab5f4389-6ac2-4eab-b05a-9657f9124db1-host-slash\") pod \"iptables-alerter-5zp6w\" (UID: \"ab5f4389-6ac2-4eab-b05a-9657f9124db1\") " pod="openshift-network-operator/iptables-alerter-5zp6w"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-var-lib-kubelet\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb28e67f-4312-4175-a5fa-26a033fdf402-tuning-conf-dir\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837140 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-run-k8s-cni-cncf-io\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-multus-conf-dir\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837188 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb28e67f-4312-4175-a5fa-26a033fdf402-cnibin\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.837348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837194 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837244 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-cni-bin\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-sysconfig\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k29j\" (UniqueName: \"kubernetes.io/projected/ab5f4389-6ac2-4eab-b05a-9657f9124db1-kube-api-access-2k29j\") pod \"iptables-alerter-5zp6w\" (UID: \"ab5f4389-6ac2-4eab-b05a-9657f9124db1\") " pod="openshift-network-operator/iptables-alerter-5zp6w"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837300 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-var-lib-kubelet\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837351 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837375 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-registration-dir\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837419 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-sysconfig\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837431 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb28e67f-4312-4175-a5fa-26a033fdf402-tuning-conf-dir\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-cnibin\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/001543e3-9932-4f51-a285-c188ebe53071-ovn-node-metrics-cert\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837513 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb28e67f-4312-4175-a5fa-26a033fdf402-os-release\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837539 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/027e0f5b-b1d5-4ef4-a370-2ba4520f5d94-konnectivity-ca\") pod \"konnectivity-agent-27hl5\" (UID: \"027e0f5b-b1d5-4ef4-a370-2ba4520f5d94\") " pod="kube-system/konnectivity-agent-27hl5"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837565 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-cni-netd\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837591 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-socket-dir\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-run-netns\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.838174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837627 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-run\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb28e67f-4312-4175-a5fa-26a033fdf402-os-release\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837650 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-cni-netd\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmmr\" (UniqueName: \"kubernetes.io/projected/001543e3-9932-4f51-a285-c188ebe53071-kube-api-access-tgmmr\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837698 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-tmp\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837743 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-hostroot\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2-tmp-dir\") pod \"node-resolver-cqg2m\" (UID: \"3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2\") " pod="openshift-dns/node-resolver-cqg2m"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-socket-dir\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-device-dir\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837821 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-var-lib-openvswitch\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837846 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/001543e3-9932-4f51-a285-c188ebe53071-ovnkube-script-lib\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837868 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-sysctl-conf\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-host\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837913 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb28e67f-4312-4175-a5fa-26a033fdf402-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837940 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-system-cni-dir\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-sys-fs\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.837698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-run-netns\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:23:33.838972 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838018 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-sys-fs\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg"
Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-device-dir\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838075 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-host\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838606 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/001543e3-9932-4f51-a285-c188ebe53071-ovnkube-script-lib\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838606 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb28e67f-4312-4175-a5fa-26a033fdf402-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838651 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-var-lib-openvswitch\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eb28e67f-4312-4175-a5fa-26a033fdf402-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838694 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-os-release\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838745 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbd63835-2911-4f84-8572-eceb35993627-cni-binary-copy\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtbb4\" (UniqueName: \"kubernetes.io/projected/dbd63835-2911-4f84-8572-eceb35993627-kube-api-access-xtbb4\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-sysctl-conf\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838790 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wmc6\" (UniqueName: 
\"kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6\") pod \"network-check-target-fhzh7\" (UID: \"189e1287-287e-4d9e-aabb-f15459c7ac43\") " pod="openshift-network-diagnostics/network-check-target-fhzh7" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838843 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838888 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-node-log\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-sysctl-d\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838922 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-node-log\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-fmzzz\" (UniqueName: \"kubernetes.io/projected/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-kube-api-access-fmzzz\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.839626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.838987 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/277e1d66-9594-42a8-b953-7fcddeac7dad-host\") pod \"node-ca-b9dhn\" (UID: \"277e1d66-9594-42a8-b953-7fcddeac7dad\") " pod="openshift-image-registry/node-ca-b9dhn" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839013 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9ltq\" (UniqueName: \"kubernetes.io/projected/277e1d66-9594-42a8-b953-7fcddeac7dad-kube-api-access-c9ltq\") pod \"node-ca-b9dhn\" (UID: \"277e1d66-9594-42a8-b953-7fcddeac7dad\") " pod="openshift-image-registry/node-ca-b9dhn" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-sysctl-d\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839069 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/277e1d66-9594-42a8-b953-7fcddeac7dad-host\") pod \"node-ca-b9dhn\" (UID: \"277e1d66-9594-42a8-b953-7fcddeac7dad\") " pod="openshift-image-registry/node-ca-b9dhn" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839116 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ksst4\" (UniqueName: \"kubernetes.io/projected/3f7c0766-21b4-4016-9a86-f022651a4b2e-kube-api-access-ksst4\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839146 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eb28e67f-4312-4175-a5fa-26a033fdf402-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/001543e3-9932-4f51-a285-c188ebe53071-ovnkube-config\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-run-systemd\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839209 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-etc-openvswitch\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 
19:23:33.839235 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/027e0f5b-b1d5-4ef4-a370-2ba4520f5d94-agent-certs\") pod \"konnectivity-agent-27hl5\" (UID: \"027e0f5b-b1d5-4ef4-a370-2ba4520f5d94\") " pod="kube-system/konnectivity-agent-27hl5" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839263 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ab5f4389-6ac2-4eab-b05a-9657f9124db1-iptables-alerter-script\") pod \"iptables-alerter-5zp6w\" (UID: \"ab5f4389-6ac2-4eab-b05a-9657f9124db1\") " pod="openshift-network-operator/iptables-alerter-5zp6w" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-run-systemd\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839282 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-etc-openvswitch\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839289 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dbd63835-2911-4f84-8572-eceb35993627-multus-daemon-config\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.840203 
ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-run-multus-certs\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839338 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2-hosts-file\") pod \"node-resolver-cqg2m\" (UID: \"3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2\") " pod="openshift-dns/node-resolver-cqg2m" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839364 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" Apr 22 19:23:33.840203 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839391 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-run-ovn-kubernetes\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839415 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-lib-modules\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839438 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/001543e3-9932-4f51-a285-c188ebe53071-env-overrides\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839442 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839443 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-run-ovn-kubernetes\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839466 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb28e67f-4312-4175-a5fa-26a033fdf402-cni-binary-copy\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839506 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnq6q\" (UniqueName: 
\"kubernetes.io/projected/eb28e67f-4312-4175-a5fa-26a033fdf402-kube-api-access-qnq6q\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839535 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-multus-socket-dir-parent\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839538 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-lib-modules\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-kubelet\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839604 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-systemd-units\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839627 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-kubernetes\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/277e1d66-9594-42a8-b953-7fcddeac7dad-serviceca\") pod \"node-ca-b9dhn\" (UID: \"277e1d66-9594-42a8-b953-7fcddeac7dad\") " pod="openshift-image-registry/node-ca-b9dhn" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839655 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/001543e3-9932-4f51-a285-c188ebe53071-ovnkube-config\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839679 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-multus-cni-dir\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839706 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-slash\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839753 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-run-ovn\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.840792 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ab5f4389-6ac2-4eab-b05a-9657f9124db1-iptables-alerter-script\") pod \"iptables-alerter-5zp6w\" (UID: \"ab5f4389-6ac2-4eab-b05a-9657f9124db1\") " pod="openshift-network-operator/iptables-alerter-5zp6w" Apr 22 19:23:33.841424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839800 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-kubernetes\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.841424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839833 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-modprobe-d\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.841424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-kubelet\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.841424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839857 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-host-slash\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.841424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-run-ovn\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.841424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.839883 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/001543e3-9932-4f51-a285-c188ebe53071-systemd-units\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.841424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.840019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-modprobe-d\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.841424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.840026 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/001543e3-9932-4f51-a285-c188ebe53071-env-overrides\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.841424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.840484 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/eb28e67f-4312-4175-a5fa-26a033fdf402-cni-binary-copy\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt" Apr 22 19:23:33.841424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.840705 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/277e1d66-9594-42a8-b953-7fcddeac7dad-serviceca\") pod \"node-ca-b9dhn\" (UID: \"277e1d66-9594-42a8-b953-7fcddeac7dad\") " pod="openshift-image-registry/node-ca-b9dhn" Apr 22 19:23:33.841424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.841316 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/001543e3-9932-4f51-a285-c188ebe53071-ovn-node-metrics-cert\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.841424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.841358 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-tmp\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.841922 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.841785 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-etc-tuned\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.847847 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.847824 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k29j\" (UniqueName: 
\"kubernetes.io/projected/ab5f4389-6ac2-4eab-b05a-9657f9124db1-kube-api-access-2k29j\") pod \"iptables-alerter-5zp6w\" (UID: \"ab5f4389-6ac2-4eab-b05a-9657f9124db1\") " pod="openshift-network-operator/iptables-alerter-5zp6w" Apr 22 19:23:33.848288 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.848262 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmmr\" (UniqueName: \"kubernetes.io/projected/001543e3-9932-4f51-a285-c188ebe53071-kube-api-access-tgmmr\") pod \"ovnkube-node-rffw9\" (UID: \"001543e3-9932-4f51-a285-c188ebe53071\") " pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:33.849519 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.849215 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9ltq\" (UniqueName: \"kubernetes.io/projected/277e1d66-9594-42a8-b953-7fcddeac7dad-kube-api-access-c9ltq\") pod \"node-ca-b9dhn\" (UID: \"277e1d66-9594-42a8-b953-7fcddeac7dad\") " pod="openshift-image-registry/node-ca-b9dhn" Apr 22 19:23:33.849519 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.849324 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnq6q\" (UniqueName: \"kubernetes.io/projected/eb28e67f-4312-4175-a5fa-26a033fdf402-kube-api-access-qnq6q\") pod \"multus-additional-cni-plugins-grxvt\" (UID: \"eb28e67f-4312-4175-a5fa-26a033fdf402\") " pod="openshift-multus/multus-additional-cni-plugins-grxvt" Apr 22 19:23:33.849519 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.849486 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmzzz\" (UniqueName: \"kubernetes.io/projected/ed659788-ce5c-4f08-b7a2-84ca2fdda6df-kube-api-access-fmzzz\") pod \"tuned-dtzjz\" (UID: \"ed659788-ce5c-4f08-b7a2-84ca2fdda6df\") " pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:33.849921 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.849898 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-j2scc\" (UniqueName: \"kubernetes.io/projected/72a6beac-e488-4f4d-8f5b-a6a038b1a99d-kube-api-access-j2scc\") pod \"aws-ebs-csi-driver-node-74xsg\" (UID: \"72a6beac-e488-4f4d-8f5b-a6a038b1a99d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" Apr 22 19:23:33.940493 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940463 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-run-k8s-cni-cncf-io\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.940652 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940497 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-multus-conf-dir\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.940652 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-cnibin\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.940652 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940542 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/027e0f5b-b1d5-4ef4-a370-2ba4520f5d94-konnectivity-ca\") pod \"konnectivity-agent-27hl5\" (UID: \"027e0f5b-b1d5-4ef4-a370-2ba4520f5d94\") " pod="kube-system/konnectivity-agent-27hl5" Apr 22 19:23:33.940652 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940584 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-run-k8s-cni-cncf-io\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.940652 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940598 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-multus-conf-dir\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.940652 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940622 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-cnibin\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.940652 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940637 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-hostroot\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2-tmp-dir\") pod \"node-resolver-cqg2m\" (UID: \"3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2\") " pod="openshift-dns/node-resolver-cqg2m" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940672 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-hostroot\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940697 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-system-cni-dir\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-os-release\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940758 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbd63835-2911-4f84-8572-eceb35993627-cni-binary-copy\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-system-cni-dir\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtbb4\" (UniqueName: \"kubernetes.io/projected/dbd63835-2911-4f84-8572-eceb35993627-kube-api-access-xtbb4\") pod 
\"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940820 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmc6\" (UniqueName: \"kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6\") pod \"network-check-target-fhzh7\" (UID: \"189e1287-287e-4d9e-aabb-f15459c7ac43\") " pod="openshift-network-diagnostics/network-check-target-fhzh7" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940859 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-os-release\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksst4\" (UniqueName: \"kubernetes.io/projected/3f7c0766-21b4-4016-9a86-f022651a4b2e-kube-api-access-ksst4\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940907 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/027e0f5b-b1d5-4ef4-a370-2ba4520f5d94-agent-certs\") pod \"konnectivity-agent-27hl5\" (UID: \"027e0f5b-b1d5-4ef4-a370-2ba4520f5d94\") " pod="kube-system/konnectivity-agent-27hl5" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940934 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dbd63835-2911-4f84-8572-eceb35993627-multus-daemon-config\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940968 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-run-multus-certs\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941008 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.940991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2-hosts-file\") pod \"node-resolver-cqg2m\" (UID: \"3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2\") " pod="openshift-dns/node-resolver-cqg2m" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941034 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-multus-socket-dir-parent\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-multus-cni-dir\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2-hosts-file\") pod \"node-resolver-cqg2m\" (UID: \"3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2\") " pod="openshift-dns/node-resolver-cqg2m" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941097 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-var-lib-cni-bin\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941102 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/027e0f5b-b1d5-4ef4-a370-2ba4520f5d94-konnectivity-ca\") pod \"konnectivity-agent-27hl5\" (UID: \"027e0f5b-b1d5-4ef4-a370-2ba4520f5d94\") " pod="kube-system/konnectivity-agent-27hl5" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-run-netns\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941132 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-run-multus-certs\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-var-lib-cni-multus\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941164 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-multus-socket-dir-parent\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941174 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-var-lib-kubelet\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-var-lib-cni-bin\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941183 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-multus-cni-dir\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941199 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-etc-kubernetes\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941210 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-run-netns\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-var-lib-cni-multus\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-host-var-lib-kubelet\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941250 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dbd63835-2911-4f84-8572-eceb35993627-etc-kubernetes\") pod 
\"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9z9h\" (UniqueName: \"kubernetes.io/projected/3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2-kube-api-access-s9z9h\") pod \"node-resolver-cqg2m\" (UID: \"3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2\") " pod="openshift-dns/node-resolver-cqg2m" Apr 22 19:23:33.941758 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:33.941325 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:33.942431 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941037 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2-tmp-dir\") pod \"node-resolver-cqg2m\" (UID: \"3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2\") " pod="openshift-dns/node-resolver-cqg2m" Apr 22 19:23:33.942431 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941424 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dbd63835-2911-4f84-8572-eceb35993627-cni-binary-copy\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.942431 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:33.941449 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs podName:3f7c0766-21b4-4016-9a86-f022651a4b2e nodeName:}" failed. No retries permitted until 2026-04-22 19:23:34.441416966 +0000 UTC m=+3.130440045 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs") pod "network-metrics-daemon-rwxr2" (UID: "3f7c0766-21b4-4016-9a86-f022651a4b2e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:33.942431 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.941620 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/dbd63835-2911-4f84-8572-eceb35993627-multus-daemon-config\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.943371 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.943351 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/027e0f5b-b1d5-4ef4-a370-2ba4520f5d94-agent-certs\") pod \"konnectivity-agent-27hl5\" (UID: \"027e0f5b-b1d5-4ef4-a370-2ba4520f5d94\") " pod="kube-system/konnectivity-agent-27hl5" Apr 22 19:23:33.949603 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:33.949581 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:33.949693 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:33.949606 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:33.949693 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:33.949624 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9wmc6 for pod openshift-network-diagnostics/network-check-target-fhzh7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:33.949693 
ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:33.949689 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6 podName:189e1287-287e-4d9e-aabb-f15459c7ac43 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:34.449673265 +0000 UTC m=+3.138696367 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9wmc6" (UniqueName: "kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6") pod "network-check-target-fhzh7" (UID: "189e1287-287e-4d9e-aabb-f15459c7ac43") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:33.951660 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.951621 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9z9h\" (UniqueName: \"kubernetes.io/projected/3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2-kube-api-access-s9z9h\") pod \"node-resolver-cqg2m\" (UID: \"3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2\") " pod="openshift-dns/node-resolver-cqg2m" Apr 22 19:23:33.951925 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.951902 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtbb4\" (UniqueName: \"kubernetes.io/projected/dbd63835-2911-4f84-8572-eceb35993627-kube-api-access-xtbb4\") pod \"multus-mz5vn\" (UID: \"dbd63835-2911-4f84-8572-eceb35993627\") " pod="openshift-multus/multus-mz5vn" Apr 22 19:23:33.952013 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:33.951997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksst4\" (UniqueName: \"kubernetes.io/projected/3f7c0766-21b4-4016-9a86-f022651a4b2e-kube-api-access-ksst4\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 
22 19:23:34.029509 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.029432 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:34.037297 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.037274 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" Apr 22 19:23:34.045013 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.044992 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b9dhn" Apr 22 19:23:34.050547 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.050527 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5zp6w" Apr 22 19:23:34.058102 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.058083 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-grxvt" Apr 22 19:23:34.065350 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.065329 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" Apr 22 19:23:34.070955 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.070933 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mz5vn" Apr 22 19:23:34.078412 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.078395 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cqg2m" Apr 22 19:23:34.083958 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.083938 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-27hl5" Apr 22 19:23:34.425376 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:34.425347 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbd63835_2911_4f84_8572_eceb35993627.slice/crio-719cc5d72e6de796f74704cba8d80a5865dd18e08354b475a89255b4b6b09708 WatchSource:0}: Error finding container 719cc5d72e6de796f74704cba8d80a5865dd18e08354b475a89255b4b6b09708: Status 404 returned error can't find the container with id 719cc5d72e6de796f74704cba8d80a5865dd18e08354b475a89255b4b6b09708 Apr 22 19:23:34.426258 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:34.426229 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab5f4389_6ac2_4eab_b05a_9657f9124db1.slice/crio-06bab9c80a0c6cbb11ff0058fddddf3529dc392e7534685f8f8d755abecea670 WatchSource:0}: Error finding container 06bab9c80a0c6cbb11ff0058fddddf3529dc392e7534685f8f8d755abecea670: Status 404 returned error can't find the container with id 06bab9c80a0c6cbb11ff0058fddddf3529dc392e7534685f8f8d755abecea670 Apr 22 19:23:34.427505 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:34.427429 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ab42ccb_aaa7_4fc9_ba6b_0440f5b86ef2.slice/crio-29754a1619e6d532f949f189b00912431ca4ea26aa67070e01db8c806ab73f4f WatchSource:0}: Error finding container 29754a1619e6d532f949f189b00912431ca4ea26aa67070e01db8c806ab73f4f: Status 404 returned error can't find the container with id 29754a1619e6d532f949f189b00912431ca4ea26aa67070e01db8c806ab73f4f Apr 22 19:23:34.428153 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:34.428132 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded659788_ce5c_4f08_b7a2_84ca2fdda6df.slice/crio-e8f6b0b401dc0850b2709808c12a8c5d922825b833810b7f884c4ca056fff930 WatchSource:0}: Error finding container e8f6b0b401dc0850b2709808c12a8c5d922825b833810b7f884c4ca056fff930: Status 404 returned error can't find the container with id e8f6b0b401dc0850b2709808c12a8c5d922825b833810b7f884c4ca056fff930 Apr 22 19:23:34.431073 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:34.431026 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod027e0f5b_b1d5_4ef4_a370_2ba4520f5d94.slice/crio-44efdfd2653336101cadade0e457b27087058ca2bbddc4a6e73dcd583bd4b9c1 WatchSource:0}: Error finding container 44efdfd2653336101cadade0e457b27087058ca2bbddc4a6e73dcd583bd4b9c1: Status 404 returned error can't find the container with id 44efdfd2653336101cadade0e457b27087058ca2bbddc4a6e73dcd583bd4b9c1 Apr 22 19:23:34.431668 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:34.431647 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod001543e3_9932_4f51_a285_c188ebe53071.slice/crio-932b272d8460afea0557a02ca4694372a9aedd422bda0cb67e90bc42c52a7d8d WatchSource:0}: Error finding container 932b272d8460afea0557a02ca4694372a9aedd422bda0cb67e90bc42c52a7d8d: Status 404 returned error can't find the container with id 932b272d8460afea0557a02ca4694372a9aedd422bda0cb67e90bc42c52a7d8d Apr 22 19:23:34.432818 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:34.432794 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod277e1d66_9594_42a8_b953_7fcddeac7dad.slice/crio-bc83b472e5b2419df450d73a67d474fd435336f392e0c56782f0b2cbaa1cf38a WatchSource:0}: Error finding container bc83b472e5b2419df450d73a67d474fd435336f392e0c56782f0b2cbaa1cf38a: Status 404 returned error can't find 
the container with id bc83b472e5b2419df450d73a67d474fd435336f392e0c56782f0b2cbaa1cf38a Apr 22 19:23:34.433599 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:34.433581 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72a6beac_e488_4f4d_8f5b_a6a038b1a99d.slice/crio-dd557d8a714e24561be4ab016564670c0c03125fb1cc17475428ba9a6baf5967 WatchSource:0}: Error finding container dd557d8a714e24561be4ab016564670c0c03125fb1cc17475428ba9a6baf5967: Status 404 returned error can't find the container with id dd557d8a714e24561be4ab016564670c0c03125fb1cc17475428ba9a6baf5967 Apr 22 19:23:34.435594 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:23:34.435573 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb28e67f_4312_4175_a5fa_26a033fdf402.slice/crio-c888f85359d3646a307c1ca054b86ed7035f233270b0eff372424feae3a30371 WatchSource:0}: Error finding container c888f85359d3646a307c1ca054b86ed7035f233270b0eff372424feae3a30371: Status 404 returned error can't find the container with id c888f85359d3646a307c1ca054b86ed7035f233270b0eff372424feae3a30371 Apr 22 19:23:34.444763 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.444697 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:23:34.444904 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:34.444878 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:34.445000 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:34.444962 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs podName:3f7c0766-21b4-4016-9a86-f022651a4b2e nodeName:}" failed. No retries permitted until 2026-04-22 19:23:35.444929602 +0000 UTC m=+4.133952685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs") pod "network-metrics-daemon-rwxr2" (UID: "3f7c0766-21b4-4016-9a86-f022651a4b2e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:23:34.545411 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.545244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmc6\" (UniqueName: \"kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6\") pod \"network-check-target-fhzh7\" (UID: \"189e1287-287e-4d9e-aabb-f15459c7ac43\") " pod="openshift-network-diagnostics/network-check-target-fhzh7" Apr 22 19:23:34.545554 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:34.545393 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:23:34.545554 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:34.545473 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:23:34.545554 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:34.545487 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9wmc6 for pod openshift-network-diagnostics/network-check-target-fhzh7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:23:34.545640 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:34.545572 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6 podName:189e1287-287e-4d9e-aabb-f15459c7ac43 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:35.545555349 +0000 UTC m=+4.234578429 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9wmc6" (UniqueName: "kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6") pod "network-check-target-fhzh7" (UID: "189e1287-287e-4d9e-aabb-f15459c7ac43") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:34.768528 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.768396 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:18:32 +0000 UTC" deadline="2027-10-17 16:26:54.291909915 +0000 UTC"
Apr 22 19:23:34.768528 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.768433 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13029h3m19.523479589s"
Apr 22 19:23:34.861001 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.860860 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:34.861199 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:34.861167 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43"
Apr 22 19:23:34.883045 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.883006 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grxvt" event={"ID":"eb28e67f-4312-4175-a5fa-26a033fdf402","Type":"ContainerStarted","Data":"c888f85359d3646a307c1ca054b86ed7035f233270b0eff372424feae3a30371"}
Apr 22 19:23:34.891092 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.891057 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" event={"ID":"72a6beac-e488-4f4d-8f5b-a6a038b1a99d","Type":"ContainerStarted","Data":"dd557d8a714e24561be4ab016564670c0c03125fb1cc17475428ba9a6baf5967"}
Apr 22 19:23:34.894374 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.894309 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b9dhn" event={"ID":"277e1d66-9594-42a8-b953-7fcddeac7dad","Type":"ContainerStarted","Data":"bc83b472e5b2419df450d73a67d474fd435336f392e0c56782f0b2cbaa1cf38a"}
Apr 22 19:23:34.897557 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.897497 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" event={"ID":"001543e3-9932-4f51-a285-c188ebe53071","Type":"ContainerStarted","Data":"932b272d8460afea0557a02ca4694372a9aedd422bda0cb67e90bc42c52a7d8d"}
Apr 22 19:23:34.901501 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.901427 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" event={"ID":"ed659788-ce5c-4f08-b7a2-84ca2fdda6df","Type":"ContainerStarted","Data":"e8f6b0b401dc0850b2709808c12a8c5d922825b833810b7f884c4ca056fff930"}
Apr 22 19:23:34.909901 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.909879 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cqg2m" event={"ID":"3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2","Type":"ContainerStarted","Data":"29754a1619e6d532f949f189b00912431ca4ea26aa67070e01db8c806ab73f4f"}
Apr 22 19:23:34.912566 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.912538 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5zp6w" event={"ID":"ab5f4389-6ac2-4eab-b05a-9657f9124db1","Type":"ContainerStarted","Data":"06bab9c80a0c6cbb11ff0058fddddf3529dc392e7534685f8f8d755abecea670"}
Apr 22 19:23:34.920774 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.920739 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" event={"ID":"fab854a3125f200d91109b3a7636112a","Type":"ContainerStarted","Data":"492ee15542d7b37f13a89e1f8e4656b674d020e056bc2b84b750214d16a04f55"}
Apr 22 19:23:34.930916 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.930893 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-27hl5" event={"ID":"027e0f5b-b1d5-4ef4-a370-2ba4520f5d94","Type":"ContainerStarted","Data":"44efdfd2653336101cadade0e457b27087058ca2bbddc4a6e73dcd583bd4b9c1"}
Apr 22 19:23:34.941261 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.941205 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-144.ec2.internal" podStartSLOduration=2.94118918 podStartE2EDuration="2.94118918s" podCreationTimestamp="2026-04-22 19:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:34.940522176 +0000 UTC m=+3.629545278" watchObservedRunningTime="2026-04-22 19:23:34.94118918 +0000 UTC m=+3.630212283"
Apr 22 19:23:34.943129 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:34.943107 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mz5vn" event={"ID":"dbd63835-2911-4f84-8572-eceb35993627","Type":"ContainerStarted","Data":"719cc5d72e6de796f74704cba8d80a5865dd18e08354b475a89255b4b6b09708"}
Apr 22 19:23:35.453160 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:35.453132 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:35.453301 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:35.453284 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:35.453376 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:35.453346 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs podName:3f7c0766-21b4-4016-9a86-f022651a4b2e nodeName:}" failed. No retries permitted until 2026-04-22 19:23:37.453326629 +0000 UTC m=+6.142349709 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs") pod "network-metrics-daemon-rwxr2" (UID: "3f7c0766-21b4-4016-9a86-f022651a4b2e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:35.553731 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:35.553656 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmc6\" (UniqueName: \"kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6\") pod \"network-check-target-fhzh7\" (UID: \"189e1287-287e-4d9e-aabb-f15459c7ac43\") " pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:35.553897 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:35.553870 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:35.553897 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:35.553890 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:35.553997 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:35.553903 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9wmc6 for pod openshift-network-diagnostics/network-check-target-fhzh7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:35.553997 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:35.553959 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6 podName:189e1287-287e-4d9e-aabb-f15459c7ac43 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:37.553939407 +0000 UTC m=+6.242962499 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9wmc6" (UniqueName: "kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6") pod "network-check-target-fhzh7" (UID: "189e1287-287e-4d9e-aabb-f15459c7ac43") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:35.863365 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:35.863285 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:35.863815 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:35.863429 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e"
Apr 22 19:23:35.956485 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:35.955461 2574 generic.go:358] "Generic (PLEG): container finished" podID="d2b23199daad0afd959c853a99042fee" containerID="585e499d4cc96de015e5ba4afa2e502f2468ec9a5539773603811ba04035792b" exitCode=0
Apr 22 19:23:35.956485 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:35.956438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" event={"ID":"d2b23199daad0afd959c853a99042fee","Type":"ContainerDied","Data":"585e499d4cc96de015e5ba4afa2e502f2468ec9a5539773603811ba04035792b"}
Apr 22 19:23:36.861352 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:36.861319 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:36.861572 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:36.861500 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43"
Apr 22 19:23:36.977217 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:36.977149 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" event={"ID":"d2b23199daad0afd959c853a99042fee","Type":"ContainerStarted","Data":"86641227e3b5f121b3fa0dd5aac8833985fea9c4d03c99bcf1d98d0d4ce337c7"}
Apr 22 19:23:36.994821 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:36.993871 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-144.ec2.internal" podStartSLOduration=4.993850366 podStartE2EDuration="4.993850366s" podCreationTimestamp="2026-04-22 19:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:36.9936047 +0000 UTC m=+5.682627805" watchObservedRunningTime="2026-04-22 19:23:36.993850366 +0000 UTC m=+5.682873468"
Apr 22 19:23:37.468374 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:37.468326 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:37.468621 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:37.468509 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:37.468621 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:37.468578 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs podName:3f7c0766-21b4-4016-9a86-f022651a4b2e nodeName:}" failed. No retries permitted until 2026-04-22 19:23:41.468558922 +0000 UTC m=+10.157582006 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs") pod "network-metrics-daemon-rwxr2" (UID: "3f7c0766-21b4-4016-9a86-f022651a4b2e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:37.569457 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:37.569413 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmc6\" (UniqueName: \"kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6\") pod \"network-check-target-fhzh7\" (UID: \"189e1287-287e-4d9e-aabb-f15459c7ac43\") " pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:37.569620 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:37.569599 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:37.569620 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:37.569620 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:37.569769 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:37.569633 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9wmc6 for pod openshift-network-diagnostics/network-check-target-fhzh7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:37.569769 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:37.569692 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6 podName:189e1287-287e-4d9e-aabb-f15459c7ac43 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:41.569673017 +0000 UTC m=+10.258696099 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-9wmc6" (UniqueName: "kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6") pod "network-check-target-fhzh7" (UID: "189e1287-287e-4d9e-aabb-f15459c7ac43") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:37.862235 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:37.861799 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:37.862235 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:37.861953 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e"
Apr 22 19:23:38.861935 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:38.861497 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:38.861935 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:38.861648 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43"
Apr 22 19:23:39.861193 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:39.860685 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:39.861193 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:39.860827 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e"
Apr 22 19:23:40.860783 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:40.860750 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:40.861218 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:40.860880 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43"
Apr 22 19:23:41.505309 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:41.505262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:41.505595 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:41.505409 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:41.505595 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:41.505527 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs podName:3f7c0766-21b4-4016-9a86-f022651a4b2e nodeName:}" failed. No retries permitted until 2026-04-22 19:23:49.505469241 +0000 UTC m=+18.194492326 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs") pod "network-metrics-daemon-rwxr2" (UID: "3f7c0766-21b4-4016-9a86-f022651a4b2e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:41.605926 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:41.605737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmc6\" (UniqueName: \"kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6\") pod \"network-check-target-fhzh7\" (UID: \"189e1287-287e-4d9e-aabb-f15459c7ac43\") " pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:41.605926 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:41.605920 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:41.606154 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:41.605939 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:41.606154 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:41.605951 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9wmc6 for pod openshift-network-diagnostics/network-check-target-fhzh7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:41.606154 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:41.606008 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6 podName:189e1287-287e-4d9e-aabb-f15459c7ac43 nodeName:}" failed. No retries permitted until 2026-04-22 19:23:49.605988111 +0000 UTC m=+18.295011204 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9wmc6" (UniqueName: "kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6") pod "network-check-target-fhzh7" (UID: "189e1287-287e-4d9e-aabb-f15459c7ac43") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:41.861640 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:41.861546 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:41.862106 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:41.861679 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e"
Apr 22 19:23:42.860668 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:42.860634 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:42.860851 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:42.860773 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43"
Apr 22 19:23:43.861831 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:43.861740 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:43.862254 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:43.861909 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e"
Apr 22 19:23:44.861140 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:44.861106 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:44.861336 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:44.861215 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43"
Apr 22 19:23:45.860895 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:45.860645 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:45.860895 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:45.860795 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e"
Apr 22 19:23:46.861360 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:46.861277 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:46.861769 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:46.861402 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43"
Apr 22 19:23:47.861334 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:47.861294 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:47.861517 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:47.861410 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e"
Apr 22 19:23:48.861529 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:48.861493 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:48.861965 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:48.861617 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43"
Apr 22 19:23:49.563324 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:49.563293 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:49.563484 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:49.563425 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:49.563484 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:49.563480 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs podName:3f7c0766-21b4-4016-9a86-f022651a4b2e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:05.563463321 +0000 UTC m=+34.252486400 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs") pod "network-metrics-daemon-rwxr2" (UID: "3f7c0766-21b4-4016-9a86-f022651a4b2e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:23:49.663828 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:49.663792 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmc6\" (UniqueName: \"kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6\") pod \"network-check-target-fhzh7\" (UID: \"189e1287-287e-4d9e-aabb-f15459c7ac43\") " pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:49.664000 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:49.663978 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:23:49.664051 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:49.664007 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:23:49.664051 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:49.664021 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9wmc6 for pod openshift-network-diagnostics/network-check-target-fhzh7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:49.664127 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:49.664086 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6 podName:189e1287-287e-4d9e-aabb-f15459c7ac43 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:05.664068024 +0000 UTC m=+34.353091107 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9wmc6" (UniqueName: "kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6") pod "network-check-target-fhzh7" (UID: "189e1287-287e-4d9e-aabb-f15459c7ac43") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:23:49.861026 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:49.860946 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:49.861173 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:49.861095 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e"
Apr 22 19:23:50.861464 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:50.861433 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:23:50.861873 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:50.861534 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43"
Apr 22 19:23:51.861988 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:51.861693 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:23:51.862513 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:51.862055 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e"
Apr 22 19:23:52.002934 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.002893 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-27hl5" event={"ID":"027e0f5b-b1d5-4ef4-a370-2ba4520f5d94","Type":"ContainerStarted","Data":"f47880bb74c0372d430c68c5a23d91706bdf8cea5afdfc7eb4b3010816413ceb"}
Apr 22 19:23:52.004430 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.004399 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mz5vn" event={"ID":"dbd63835-2911-4f84-8572-eceb35993627","Type":"ContainerStarted","Data":"067f375a1d8dd69bf639cf1cc861e71a4033529fde74bd683fae65aadd850029"}
Apr 22 19:23:52.005801 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.005765 2574 generic.go:358] "Generic (PLEG): container finished" podID="eb28e67f-4312-4175-a5fa-26a033fdf402" containerID="7f926a73d34be02fe7100f8aaf4dbdae447da6a691d342c68020d74a169c0ff5" exitCode=0
Apr 22 19:23:52.005910 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.005847 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grxvt" event={"ID":"eb28e67f-4312-4175-a5fa-26a033fdf402","Type":"ContainerDied","Data":"7f926a73d34be02fe7100f8aaf4dbdae447da6a691d342c68020d74a169c0ff5"}
Apr 22 19:23:52.007348 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.007325 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" event={"ID":"72a6beac-e488-4f4d-8f5b-a6a038b1a99d","Type":"ContainerStarted","Data":"09b6cf443906e921e2bdb2c68637eb6273cdc7716e0434182bcacbd28054660b"}
Apr 22 19:23:52.009549 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.009405 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b9dhn" event={"ID":"277e1d66-9594-42a8-b953-7fcddeac7dad","Type":"ContainerStarted","Data":"3dabf040622544b0f506820ecf02f75a7b89c179cc271d8915f9ba755a0bcce7"}
Apr 22 19:23:52.014444 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.014425 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log"
Apr 22 19:23:52.014759 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.014734 2574 generic.go:358] "Generic (PLEG): container finished" podID="001543e3-9932-4f51-a285-c188ebe53071" containerID="70d992a2be65a4a8cba750f2f049500ddff80a78141eca97f79a060198627aa2" exitCode=1
Apr 22 19:23:52.014852 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.014785 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" event={"ID":"001543e3-9932-4f51-a285-c188ebe53071","Type":"ContainerStarted","Data":"763d259cc7cab3a115ea5e978653664409e9c508f8afb36a75a34a022bb0fe9b"}
Apr 22 19:23:52.014852 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.014805 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" event={"ID":"001543e3-9932-4f51-a285-c188ebe53071","Type":"ContainerStarted","Data":"3eb4d5a5fac7c9b6f372d34771c29bd7539d20f0fc633e4af12f5b86f7fa0600"}
Apr 22 19:23:52.014852 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.014819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" event={"ID":"001543e3-9932-4f51-a285-c188ebe53071","Type":"ContainerStarted","Data":"f9e05a0e63bbd9a61e56e8328d2638db11d28760d19c347eb4383decba9177db"}
Apr 22 19:23:52.014852 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.014831 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" event={"ID":"001543e3-9932-4f51-a285-c188ebe53071","Type":"ContainerStarted","Data":"9146e7f251019d3ccbeb8ab551e27d28bb3d50edad04ee04f822c5892852b3f5"}
Apr 22 19:23:52.014852 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.014848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" event={"ID":"001543e3-9932-4f51-a285-c188ebe53071","Type":"ContainerDied","Data":"70d992a2be65a4a8cba750f2f049500ddff80a78141eca97f79a060198627aa2"}
Apr 22 19:23:52.015129 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.014857 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" event={"ID":"001543e3-9932-4f51-a285-c188ebe53071","Type":"ContainerStarted","Data":"ada6c1341d98b3cc6e78331094c632799165854fdb7a2400093ab1c5321f146c"}
Apr 22 19:23:52.016039 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.016019 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" event={"ID":"ed659788-ce5c-4f08-b7a2-84ca2fdda6df","Type":"ContainerStarted","Data":"f83592b4b8cc2ff825297c3db1ef8dbf9bf4fc968cdd59c83e848d7a3c6acb4e"}
Apr 22 19:23:52.017451 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.017434 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cqg2m" event={"ID":"3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2","Type":"ContainerStarted","Data":"a7c894cfc28355d668e12b48ff0345342688c5c12804e46a332820ec8f421e52"}
Apr 22 19:23:52.018640 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.018595 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-27hl5" podStartSLOduration=11.068356178 podStartE2EDuration="20.018581358s" podCreationTimestamp="2026-04-22 19:23:32 +0000 UTC" firstStartedPulling="2026-04-22 19:23:34.432829376 +0000 UTC m=+3.121852469" lastFinishedPulling="2026-04-22 19:23:43.383054567 +0000 UTC m=+12.072077649" observedRunningTime="2026-04-22 19:23:52.01849335 +0000 UTC m=+20.707516451" watchObservedRunningTime="2026-04-22 19:23:52.018581358 +0000 UTC m=+20.707604459"
Apr 22 19:23:52.108419 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.108375 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cqg2m" podStartSLOduration=3.407437959 podStartE2EDuration="20.108360785s" podCreationTimestamp="2026-04-22 19:23:32 +0000 UTC" firstStartedPulling="2026-04-22 19:23:34.429737131 +0000 UTC m=+3.118760218" lastFinishedPulling="2026-04-22 19:23:51.13065996 +0000 UTC m=+19.819683044" observedRunningTime="2026-04-22 19:23:52.108264913 +0000 UTC m=+20.797288013" watchObservedRunningTime="2026-04-22 19:23:52.108360785 +0000 UTC m=+20.797383886"
Apr 22 19:23:52.108697 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.108677 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mz5vn" podStartSLOduration=3.367482977 podStartE2EDuration="20.108672342s" podCreationTimestamp="2026-04-22 19:23:32 +0000 UTC" firstStartedPulling="2026-04-22 19:23:34.427115251 +0000 UTC m=+3.116138335" lastFinishedPulling="2026-04-22 19:23:51.168304607 +0000 UTC m=+19.857327700" observedRunningTime="2026-04-22 19:23:52.090849814 +0000 UTC
m=+20.779872915" watchObservedRunningTime="2026-04-22 19:23:52.108672342 +0000 UTC m=+20.797695443" Apr 22 19:23:52.128563 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.128513 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dtzjz" podStartSLOduration=4.388957646 podStartE2EDuration="21.128496722s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:34.430067723 +0000 UTC m=+3.119090819" lastFinishedPulling="2026-04-22 19:23:51.169606808 +0000 UTC m=+19.858629895" observedRunningTime="2026-04-22 19:23:52.128345422 +0000 UTC m=+20.817368593" watchObservedRunningTime="2026-04-22 19:23:52.128496722 +0000 UTC m=+20.817519823" Apr 22 19:23:52.143047 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.143002 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-b9dhn" podStartSLOduration=4.490282872 podStartE2EDuration="21.142987115s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:34.434852852 +0000 UTC m=+3.123875950" lastFinishedPulling="2026-04-22 19:23:51.087557099 +0000 UTC m=+19.776580193" observedRunningTime="2026-04-22 19:23:52.142905375 +0000 UTC m=+20.831928477" watchObservedRunningTime="2026-04-22 19:23:52.142987115 +0000 UTC m=+20.832010215" Apr 22 19:23:52.811091 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.811062 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:23:52.860775 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:52.860744 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7" Apr 22 19:23:52.860895 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:52.860855 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43" Apr 22 19:23:53.021268 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:53.021185 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" event={"ID":"72a6beac-e488-4f4d-8f5b-a6a038b1a99d","Type":"ContainerStarted","Data":"15292a07feb55a1cfd8dc410049f5be40a8aa8adbffef9814fdbdd249801d811"} Apr 22 19:23:53.022615 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:53.022588 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5zp6w" event={"ID":"ab5f4389-6ac2-4eab-b05a-9657f9124db1","Type":"ContainerStarted","Data":"0de68d1f8c529f2f96acd94fb1ae1aa32788938c6379f74e1920697552312cfa"} Apr 22 19:23:53.038447 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:53.038407 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5zp6w" podStartSLOduration=5.336977675 podStartE2EDuration="22.038393532s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:34.428860601 +0000 UTC m=+3.117883690" lastFinishedPulling="2026-04-22 19:23:51.130276451 +0000 UTC m=+19.819299547" observedRunningTime="2026-04-22 19:23:53.038309088 +0000 UTC m=+21.727332190" watchObservedRunningTime="2026-04-22 19:23:53.038393532 +0000 UTC m=+21.727416634" Apr 22 19:23:53.799884 ip-10-0-135-144 kubenswrapper[2574]: I0422 
19:23:53.799792 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:23:52.811085746Z","UUID":"8ab55a4c-c17e-4264-baba-d4393abfa41b","Handler":null,"Name":"","Endpoint":""} Apr 22 19:23:53.801334 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:53.801299 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:23:53.801334 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:53.801327 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:23:53.861093 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:53.861066 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:23:53.861229 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:53.861200 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e" Apr 22 19:23:53.886070 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:53.886045 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-27hl5" Apr 22 19:23:53.886690 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:53.886669 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-27hl5" Apr 22 19:23:54.026753 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:54.026652 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" event={"ID":"72a6beac-e488-4f4d-8f5b-a6a038b1a99d","Type":"ContainerStarted","Data":"155ffc0775fd85b9aa3cd3660b52ae77fca5aac005b9c8cf4094ab95f45db413"} Apr 22 19:23:54.044523 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:54.044471 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-74xsg" podStartSLOduration=3.787321602 podStartE2EDuration="23.04445168s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:34.435782789 +0000 UTC m=+3.124805882" lastFinishedPulling="2026-04-22 19:23:53.692912872 +0000 UTC m=+22.381935960" observedRunningTime="2026-04-22 19:23:54.044018811 +0000 UTC m=+22.733041913" watchObservedRunningTime="2026-04-22 19:23:54.04445168 +0000 UTC m=+22.733474785" Apr 22 19:23:54.861558 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:54.861313 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7" Apr 22 19:23:54.861758 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:54.861662 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43" Apr 22 19:23:55.031346 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:55.031317 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:23:55.031799 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:55.031756 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" event={"ID":"001543e3-9932-4f51-a285-c188ebe53071","Type":"ContainerStarted","Data":"11943fb39c49b3e53afab05629307193604f99c007bbd6982f366d297720cece"} Apr 22 19:23:55.031866 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:55.031802 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 19:23:55.864393 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:55.864365 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:23:55.864620 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:55.864485 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e" Apr 22 19:23:56.861801 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:56.861599 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7" Apr 22 19:23:56.862470 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:56.861876 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43" Apr 22 19:23:57.039457 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:57.039420 2574 generic.go:358] "Generic (PLEG): container finished" podID="eb28e67f-4312-4175-a5fa-26a033fdf402" containerID="a18b5dcf90b211fbd7267b5f65f017cc85b5af7fd8e46cc595712bf4035fdc8b" exitCode=0 Apr 22 19:23:57.039602 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:57.039499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grxvt" event={"ID":"eb28e67f-4312-4175-a5fa-26a033fdf402","Type":"ContainerDied","Data":"a18b5dcf90b211fbd7267b5f65f017cc85b5af7fd8e46cc595712bf4035fdc8b"} Apr 22 19:23:57.042510 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:57.042495 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:23:57.042877 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:57.042855 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" 
event={"ID":"001543e3-9932-4f51-a285-c188ebe53071","Type":"ContainerStarted","Data":"404ab7cc4fb44df60af192f1e858e607aef32efd086140b3d872634eaaa60110"} Apr 22 19:23:57.043191 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:57.043169 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:57.043297 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:57.043284 2574 scope.go:117] "RemoveContainer" containerID="70d992a2be65a4a8cba750f2f049500ddff80a78141eca97f79a060198627aa2" Apr 22 19:23:57.058488 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:57.058471 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:57.863962 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:57.863936 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:23:57.864398 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:57.864033 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e" Apr 22 19:23:58.046860 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:58.046819 2574 generic.go:358] "Generic (PLEG): container finished" podID="eb28e67f-4312-4175-a5fa-26a033fdf402" containerID="a8e6b24e142d304bc2f0c7edce28819f9980d1216c5edd06a300cd4862ad3009" exitCode=0 Apr 22 19:23:58.047019 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:58.046909 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grxvt" event={"ID":"eb28e67f-4312-4175-a5fa-26a033fdf402","Type":"ContainerDied","Data":"a8e6b24e142d304bc2f0c7edce28819f9980d1216c5edd06a300cd4862ad3009"} Apr 22 19:23:58.050395 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:58.050372 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:23:58.050835 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:58.050806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" event={"ID":"001543e3-9932-4f51-a285-c188ebe53071","Type":"ContainerStarted","Data":"a190051b526a106e9454d4f63c0e776931d56dd5b970a709d5b68aa0cd23a4d6"} Apr 22 19:23:58.051053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:58.051038 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:58.051120 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:58.051074 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:58.065760 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:58.065740 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" Apr 22 19:23:58.100344 ip-10-0-135-144 kubenswrapper[2574]: I0422 
19:23:58.100298 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9" podStartSLOduration=10.298884446 podStartE2EDuration="27.100287605s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:34.433564089 +0000 UTC m=+3.122587169" lastFinishedPulling="2026-04-22 19:23:51.234967233 +0000 UTC m=+19.923990328" observedRunningTime="2026-04-22 19:23:58.099858014 +0000 UTC m=+26.788881112" watchObservedRunningTime="2026-04-22 19:23:58.100287605 +0000 UTC m=+26.789310706" Apr 22 19:23:58.145678 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:58.145653 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fhzh7"] Apr 22 19:23:58.145809 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:58.145774 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7" Apr 22 19:23:58.145862 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:58.145847 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43" Apr 22 19:23:58.148378 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:58.148355 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rwxr2"] Apr 22 19:23:58.148480 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:58.148468 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:23:58.148594 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:58.148574 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e" Apr 22 19:23:59.054533 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:59.054354 2574 generic.go:358] "Generic (PLEG): container finished" podID="eb28e67f-4312-4175-a5fa-26a033fdf402" containerID="7c4ba78e11a8bca4238a09ffffee5183d401ffead647c81c655fcc38ad0747f8" exitCode=0 Apr 22 19:23:59.054985 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:59.054436 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grxvt" event={"ID":"eb28e67f-4312-4175-a5fa-26a033fdf402","Type":"ContainerDied","Data":"7c4ba78e11a8bca4238a09ffffee5183d401ffead647c81c655fcc38ad0747f8"} Apr 22 19:23:59.861563 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:59.861523 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:23:59.861818 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:23:59.861524 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7" Apr 22 19:23:59.861818 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:59.861667 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e" Apr 22 19:23:59.861818 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:23:59.861746 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43" Apr 22 19:24:01.328078 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:01.328040 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-27hl5" Apr 22 19:24:01.328503 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:01.328199 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 19:24:01.328948 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:01.328918 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-27hl5" Apr 22 19:24:01.862475 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:01.862427 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7" Apr 22 19:24:01.862639 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:01.862530 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43" Apr 22 19:24:01.862639 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:01.862615 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:24:01.862789 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:01.862758 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e" Apr 22 19:24:03.861111 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:03.861076 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7" Apr 22 19:24:03.861783 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:03.861208 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhzh7" podUID="189e1287-287e-4d9e-aabb-f15459c7ac43" Apr 22 19:24:03.861783 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:03.861265 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:24:03.861783 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:03.861386 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e" Apr 22 19:24:04.144601 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.144525 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-144.ec2.internal" event="NodeReady" Apr 22 19:24:04.144795 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.144688 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:24:04.210394 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.210362 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rbr82"] Apr 22 19:24:04.230462 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.230431 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tcrbq"] Apr 22 19:24:04.230638 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.230614 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rbr82" Apr 22 19:24:04.236178 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.236153 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:24:04.236349 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.236329 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:24:04.236784 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.236766 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6f5sr\"" Apr 22 19:24:04.239353 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.239297 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rbr82"] Apr 22 19:24:04.239353 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.239332 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tcrbq"] 
Apr 22 19:24:04.239506 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.239426 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tcrbq" Apr 22 19:24:04.243993 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.243959 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:24:04.244095 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.244048 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:24:04.244190 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.244174 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:24:04.244269 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.244253 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hrkhq\"" Apr 22 19:24:04.386420 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.386358 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9acd301-c698-4dda-b95a-e48e9dfbf761-tmp-dir\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82" Apr 22 19:24:04.386420 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.386407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcxh5\" (UniqueName: \"kubernetes.io/projected/d9acd301-c698-4dda-b95a-e48e9dfbf761-kube-api-access-wcxh5\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82" Apr 22 19:24:04.386664 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.386436 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq" Apr 22 19:24:04.386664 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.386457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82" Apr 22 19:24:04.386664 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.386483 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9acd301-c698-4dda-b95a-e48e9dfbf761-config-volume\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82" Apr 22 19:24:04.386664 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.386504 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zjlr\" (UniqueName: \"kubernetes.io/projected/79e5fa54-5efe-4b12-a3d1-c4a889502cea-kube-api-access-9zjlr\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq" Apr 22 19:24:04.487098 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.487069 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9acd301-c698-4dda-b95a-e48e9dfbf761-tmp-dir\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82" Apr 22 19:24:04.487098 ip-10-0-135-144 kubenswrapper[2574]: 
I0422 19:24:04.487104 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcxh5\" (UniqueName: \"kubernetes.io/projected/d9acd301-c698-4dda-b95a-e48e9dfbf761-kube-api-access-wcxh5\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:24:04.487336 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.487125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq"
Apr 22 19:24:04.487336 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.487179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:24:04.487336 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.487216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9acd301-c698-4dda-b95a-e48e9dfbf761-config-volume\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:24:04.487336 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.487234 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zjlr\" (UniqueName: \"kubernetes.io/projected/79e5fa54-5efe-4b12-a3d1-c4a889502cea-kube-api-access-9zjlr\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq"
Apr 22 19:24:04.487523 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:04.487378 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:04.487523 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:04.487465 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:04.487523 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:04.487493 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls podName:d9acd301-c698-4dda-b95a-e48e9dfbf761 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:04.987469818 +0000 UTC m=+33.676492912 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls") pod "dns-default-rbr82" (UID: "d9acd301-c698-4dda-b95a-e48e9dfbf761") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:04.487664 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:04.487555 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert podName:79e5fa54-5efe-4b12-a3d1-c4a889502cea nodeName:}" failed. No retries permitted until 2026-04-22 19:24:04.987536758 +0000 UTC m=+33.676559838 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert") pod "ingress-canary-tcrbq" (UID: "79e5fa54-5efe-4b12-a3d1-c4a889502cea") : secret "canary-serving-cert" not found
Apr 22 19:24:04.487664 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.487622 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9acd301-c698-4dda-b95a-e48e9dfbf761-tmp-dir\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:24:04.487841 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.487825 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9acd301-c698-4dda-b95a-e48e9dfbf761-config-volume\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:24:04.501577 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.501557 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcxh5\" (UniqueName: \"kubernetes.io/projected/d9acd301-c698-4dda-b95a-e48e9dfbf761-kube-api-access-wcxh5\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:24:04.506532 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.506512 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zjlr\" (UniqueName: \"kubernetes.io/projected/79e5fa54-5efe-4b12-a3d1-c4a889502cea-kube-api-access-9zjlr\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq"
Apr 22 19:24:04.989756 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.989702 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq"
Apr 22 19:24:04.989756 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:04.989760 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:24:04.990296 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:04.989868 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:04.990296 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:04.989879 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:04.990296 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:04.989939 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls podName:d9acd301-c698-4dda-b95a-e48e9dfbf761 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:05.989924701 +0000 UTC m=+34.678947780 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls") pod "dns-default-rbr82" (UID: "d9acd301-c698-4dda-b95a-e48e9dfbf761") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:04.990296 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:04.989954 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert podName:79e5fa54-5efe-4b12-a3d1-c4a889502cea nodeName:}" failed. No retries permitted until 2026-04-22 19:24:05.989947751 +0000 UTC m=+34.678970830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert") pod "ingress-canary-tcrbq" (UID: "79e5fa54-5efe-4b12-a3d1-c4a889502cea") : secret "canary-serving-cert" not found
Apr 22 19:24:05.068778 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.068752 2574 generic.go:358] "Generic (PLEG): container finished" podID="eb28e67f-4312-4175-a5fa-26a033fdf402" containerID="afafda425f149b0f9b22aaed3807c61955580cd59bc8b892cd0b8b73da12d064" exitCode=0
Apr 22 19:24:05.068886 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.068788 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grxvt" event={"ID":"eb28e67f-4312-4175-a5fa-26a033fdf402","Type":"ContainerDied","Data":"afafda425f149b0f9b22aaed3807c61955580cd59bc8b892cd0b8b73da12d064"}
Apr 22 19:24:05.594373 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.594292 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:24:05.594534 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:05.594428 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:24:05.594534 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:05.594487 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs podName:3f7c0766-21b4-4016-9a86-f022651a4b2e nodeName:}" failed. No retries permitted until 2026-04-22 19:24:37.594471919 +0000 UTC m=+66.283495003 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs") pod "network-metrics-daemon-rwxr2" (UID: "3f7c0766-21b4-4016-9a86-f022651a4b2e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:24:05.695090 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.695054 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmc6\" (UniqueName: \"kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6\") pod \"network-check-target-fhzh7\" (UID: \"189e1287-287e-4d9e-aabb-f15459c7ac43\") " pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:24:05.695276 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:05.695212 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:24:05.695276 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:05.695232 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:24:05.695276 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:05.695242 2574 projected.go:194] Error preparing data for projected volume kube-api-access-9wmc6 for pod openshift-network-diagnostics/network-check-target-fhzh7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:24:05.695397 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:05.695293 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6 podName:189e1287-287e-4d9e-aabb-f15459c7ac43 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:37.695279448 +0000 UTC m=+66.384302528 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-9wmc6" (UniqueName: "kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6") pod "network-check-target-fhzh7" (UID: "189e1287-287e-4d9e-aabb-f15459c7ac43") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:24:05.860868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.860789 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:24:05.860868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.860812 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:24:05.865342 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.865314 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8qpcz\""
Apr 22 19:24:05.865342 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.865329 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:24:05.865342 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.865338 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:24:05.865552 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.865360 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:24:05.865552 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.865369 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ff584\""
Apr 22 19:24:05.997013 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.996980 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq"
Apr 22 19:24:05.997013 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:05.997018 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:24:05.997473 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:05.997106 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:05.997473 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:05.997132 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:05.997473 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:05.997154 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls podName:d9acd301-c698-4dda-b95a-e48e9dfbf761 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:07.997140718 +0000 UTC m=+36.686163797 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls") pod "dns-default-rbr82" (UID: "d9acd301-c698-4dda-b95a-e48e9dfbf761") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:05.997473 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:05.997195 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert podName:79e5fa54-5efe-4b12-a3d1-c4a889502cea nodeName:}" failed. No retries permitted until 2026-04-22 19:24:07.997178177 +0000 UTC m=+36.686201270 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert") pod "ingress-canary-tcrbq" (UID: "79e5fa54-5efe-4b12-a3d1-c4a889502cea") : secret "canary-serving-cert" not found
Apr 22 19:24:06.073239 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:06.073205 2574 generic.go:358] "Generic (PLEG): container finished" podID="eb28e67f-4312-4175-a5fa-26a033fdf402" containerID="7a5ae386ff73bdc6a3dec027fede6a80d7a9db3890c41af49be226aff60399d8" exitCode=0
Apr 22 19:24:06.073403 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:06.073268 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grxvt" event={"ID":"eb28e67f-4312-4175-a5fa-26a033fdf402","Type":"ContainerDied","Data":"7a5ae386ff73bdc6a3dec027fede6a80d7a9db3890c41af49be226aff60399d8"}
Apr 22 19:24:07.077644 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:07.077477 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-grxvt" event={"ID":"eb28e67f-4312-4175-a5fa-26a033fdf402","Type":"ContainerStarted","Data":"dea34cd7a81917a4f9110f78fe17b4dc75a8dc7c17d3274aabbd43a120b30d21"}
Apr 22 19:24:07.104362 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:07.104039 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-grxvt" podStartSLOduration=5.763195485 podStartE2EDuration="36.104021674s" podCreationTimestamp="2026-04-22 19:23:31 +0000 UTC" firstStartedPulling="2026-04-22 19:23:34.43847287 +0000 UTC m=+3.127495949" lastFinishedPulling="2026-04-22 19:24:04.779299059 +0000 UTC m=+33.468322138" observedRunningTime="2026-04-22 19:24:07.103468229 +0000 UTC m=+35.792491331" watchObservedRunningTime="2026-04-22 19:24:07.104021674 +0000 UTC m=+35.793044775"
Apr 22 19:24:08.011446 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:08.011358 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq"
Apr 22 19:24:08.011446 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:08.011401 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:24:08.011618 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:08.011502 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:08.011618 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:08.011510 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:08.011618 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:08.011552 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls podName:d9acd301-c698-4dda-b95a-e48e9dfbf761 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:12.011538831 +0000 UTC m=+40.700561911 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls") pod "dns-default-rbr82" (UID: "d9acd301-c698-4dda-b95a-e48e9dfbf761") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:08.011618 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:08.011566 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert podName:79e5fa54-5efe-4b12-a3d1-c4a889502cea nodeName:}" failed. No retries permitted until 2026-04-22 19:24:12.011560222 +0000 UTC m=+40.700583300 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert") pod "ingress-canary-tcrbq" (UID: "79e5fa54-5efe-4b12-a3d1-c4a889502cea") : secret "canary-serving-cert" not found
Apr 22 19:24:12.038674 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:12.038640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq"
Apr 22 19:24:12.038674 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:12.038675 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:24:12.039221 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:12.038794 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:12.039221 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:12.038816 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:12.039221 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:12.038848 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls podName:d9acd301-c698-4dda-b95a-e48e9dfbf761 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:20.038832444 +0000 UTC m=+48.727855523 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls") pod "dns-default-rbr82" (UID: "d9acd301-c698-4dda-b95a-e48e9dfbf761") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:12.039221 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:12.038881 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert podName:79e5fa54-5efe-4b12-a3d1-c4a889502cea nodeName:}" failed. No retries permitted until 2026-04-22 19:24:20.038861801 +0000 UTC m=+48.727884880 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert") pod "ingress-canary-tcrbq" (UID: "79e5fa54-5efe-4b12-a3d1-c4a889502cea") : secret "canary-serving-cert" not found
Apr 22 19:24:20.096423 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:20.096379 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq"
Apr 22 19:24:20.096423 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:20.096433 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:24:20.097025 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:20.096547 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:20.097025 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:20.096623 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls podName:d9acd301-c698-4dda-b95a-e48e9dfbf761 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:36.0966017 +0000 UTC m=+64.785624787 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls") pod "dns-default-rbr82" (UID: "d9acd301-c698-4dda-b95a-e48e9dfbf761") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:20.097025 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:20.096545 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:20.097025 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:20.096695 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert podName:79e5fa54-5efe-4b12-a3d1-c4a889502cea nodeName:}" failed. No retries permitted until 2026-04-22 19:24:36.096680384 +0000 UTC m=+64.785703462 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert") pod "ingress-canary-tcrbq" (UID: "79e5fa54-5efe-4b12-a3d1-c4a889502cea") : secret "canary-serving-cert" not found
Apr 22 19:24:30.067137 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:30.067112 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rffw9"
Apr 22 19:24:36.105090 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:36.105048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq"
Apr 22 19:24:36.105090 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:36.105092 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:24:36.105574 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:36.105192 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:36.105574 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:36.105195 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:36.105574 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:36.105242 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls podName:d9acd301-c698-4dda-b95a-e48e9dfbf761 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:08.105227499 +0000 UTC m=+96.794250577 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls") pod "dns-default-rbr82" (UID: "d9acd301-c698-4dda-b95a-e48e9dfbf761") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:36.105574 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:36.105255 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert podName:79e5fa54-5efe-4b12-a3d1-c4a889502cea nodeName:}" failed. No retries permitted until 2026-04-22 19:25:08.105249496 +0000 UTC m=+96.794272575 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert") pod "ingress-canary-tcrbq" (UID: "79e5fa54-5efe-4b12-a3d1-c4a889502cea") : secret "canary-serving-cert" not found
Apr 22 19:24:37.616122 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:37.616076 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:24:37.619086 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:37.619057 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:24:37.627319 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:37.627298 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:24:37.627413 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:24:37.627359 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs podName:3f7c0766-21b4-4016-9a86-f022651a4b2e nodeName:}" failed. No retries permitted until 2026-04-22 19:25:41.627343422 +0000 UTC m=+130.316366504 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs") pod "network-metrics-daemon-rwxr2" (UID: "3f7c0766-21b4-4016-9a86-f022651a4b2e") : secret "metrics-daemon-secret" not found
Apr 22 19:24:37.717308 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:37.717265 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmc6\" (UniqueName: \"kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6\") pod \"network-check-target-fhzh7\" (UID: \"189e1287-287e-4d9e-aabb-f15459c7ac43\") " pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:24:37.720201 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:37.720184 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:24:37.730239 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:37.730219 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:24:37.741703 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:37.741681 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wmc6\" (UniqueName: \"kubernetes.io/projected/189e1287-287e-4d9e-aabb-f15459c7ac43-kube-api-access-9wmc6\") pod \"network-check-target-fhzh7\" (UID: \"189e1287-287e-4d9e-aabb-f15459c7ac43\") " pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:24:37.987503 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:37.987472 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8qpcz\""
Apr 22 19:24:37.994799 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:37.994781 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:24:38.114167 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:38.114135 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fhzh7"]
Apr 22 19:24:38.118129 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:24:38.118098 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod189e1287_287e_4d9e_aabb_f15459c7ac43.slice/crio-faca9ab72927f8793088bba48eb59b9e97bfc96b5b91bf424c91dc655190246f WatchSource:0}: Error finding container faca9ab72927f8793088bba48eb59b9e97bfc96b5b91bf424c91dc655190246f: Status 404 returned error can't find the container with id faca9ab72927f8793088bba48eb59b9e97bfc96b5b91bf424c91dc655190246f
Apr 22 19:24:38.134709 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:38.134684 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fhzh7" event={"ID":"189e1287-287e-4d9e-aabb-f15459c7ac43","Type":"ContainerStarted","Data":"faca9ab72927f8793088bba48eb59b9e97bfc96b5b91bf424c91dc655190246f"}
Apr 22 19:24:41.141285 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:41.141197 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fhzh7" event={"ID":"189e1287-287e-4d9e-aabb-f15459c7ac43","Type":"ContainerStarted","Data":"a42aa96a12926be3e95ebccace8334d6c830f0836ebf2e4c1f48ef1bfcc5f537"}
Apr 22 19:24:41.141622 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:41.141322 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:24:41.157176 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:24:41.157130 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fhzh7" podStartSLOduration=66.55650542 podStartE2EDuration="1m9.157118236s" podCreationTimestamp="2026-04-22 19:23:32 +0000 UTC" firstStartedPulling="2026-04-22 19:24:38.120516411 +0000 UTC m=+66.809539494" lastFinishedPulling="2026-04-22 19:24:40.721129228 +0000 UTC m=+69.410152310" observedRunningTime="2026-04-22 19:24:41.156655028 +0000 UTC m=+69.845678129" watchObservedRunningTime="2026-04-22 19:24:41.157118236 +0000 UTC m=+69.846141352"
Apr 22 19:25:08.128831 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:08.128743 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82"
Apr 22 19:25:08.128831 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:08.128823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq"
Apr 22 19:25:08.129287 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:08.128899 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:25:08.129287 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:08.128956 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert podName:79e5fa54-5efe-4b12-a3d1-c4a889502cea nodeName:}" failed. No retries permitted until 2026-04-22 19:26:12.128941734 +0000 UTC m=+160.817964813 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert") pod "ingress-canary-tcrbq" (UID: "79e5fa54-5efe-4b12-a3d1-c4a889502cea") : secret "canary-serving-cert" not found
Apr 22 19:25:08.129287 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:08.128897 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:25:08.129287 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:08.129055 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls podName:d9acd301-c698-4dda-b95a-e48e9dfbf761 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:12.12904051 +0000 UTC m=+160.818063589 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls") pod "dns-default-rbr82" (UID: "d9acd301-c698-4dda-b95a-e48e9dfbf761") : secret "dns-default-metrics-tls" not found
Apr 22 19:25:12.145447 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:12.145416 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fhzh7"
Apr 22 19:25:36.555222 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.555186 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dnrn8"]
Apr 22 19:25:36.559388 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.559367 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-dnrn8"
Apr 22 19:25:36.565479 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.565452 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 19:25:36.566820 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.566795 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 19:25:36.566944 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.566823 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 22 19:25:36.567282 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.567060 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-n4fq8\""
Apr 22 19:25:36.569051 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.568264 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 22 19:25:36.571514 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.571494 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 22 19:25:36.572784 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.572765 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dnrn8"]
Apr 22 19:25:36.617934 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.617902 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38aa17f7-ede3-4506-a743-892303b3d9b7-service-ca-bundle\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") "
pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.617934 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.617934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b468l\" (UniqueName: \"kubernetes.io/projected/38aa17f7-ede3-4506-a743-892303b3d9b7-kube-api-access-b468l\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.618178 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.617968 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/38aa17f7-ede3-4506-a743-892303b3d9b7-snapshots\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.618178 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.618064 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38aa17f7-ede3-4506-a743-892303b3d9b7-tmp\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.618178 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.618097 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38aa17f7-ede3-4506-a743-892303b3d9b7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.618178 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.618124 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38aa17f7-ede3-4506-a743-892303b3d9b7-serving-cert\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.644932 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.644898 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bf5vp"] Apr 22 19:25:36.647618 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.647604 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bf5vp" Apr 22 19:25:36.652145 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.652125 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:25:36.652258 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.652148 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 19:25:36.652258 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.652160 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-4vf25\"" Apr 22 19:25:36.661053 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.661029 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"] Apr 22 19:25:36.663969 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.663950 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f"] Apr 22 19:25:36.664084 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.664070 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.666814 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.666661 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bf5vp"] Apr 22 19:25:36.666911 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.666826 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" Apr 22 19:25:36.667928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.667912 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8fmt6\"" Apr 22 19:25:36.668076 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.668053 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:25:36.668231 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.668138 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:25:36.668592 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.668573 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:25:36.669625 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.669598 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 19:25:36.669735 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.669699 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 19:25:36.670694 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.670672 2574 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 19:25:36.670807 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.670710 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-xqtxd\"" Apr 22 19:25:36.670807 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.670790 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:25:36.675239 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.675222 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:25:36.681367 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.681345 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"] Apr 22 19:25:36.683910 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.683892 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f"] Apr 22 19:25:36.719464 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719427 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-bound-sa-token\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.719616 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719479 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnvsd\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-kube-api-access-cnvsd\") pod 
\"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.719616 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719500 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-image-registry-private-configuration\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.719616 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719539 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-ca-trust-extracted\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.719616 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719560 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-installation-pull-secrets\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.719616 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719605 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6ad02c8-88c6-424e-8f61-266e366dcbba-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5zk6f\" (UID: \"f6ad02c8-88c6-424e-8f61-266e366dcbba\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" Apr 22 19:25:36.719841 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719651 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qssbn\" (UniqueName: \"kubernetes.io/projected/8a37d76d-bb5f-4997-83f8-5da1496ba0e9-kube-api-access-qssbn\") pod \"volume-data-source-validator-7c6cbb6c87-bf5vp\" (UID: \"8a37d76d-bb5f-4997-83f8-5da1496ba0e9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bf5vp" Apr 22 19:25:36.719841 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719700 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38aa17f7-ede3-4506-a743-892303b3d9b7-service-ca-bundle\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.719841 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719736 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b468l\" (UniqueName: \"kubernetes.io/projected/38aa17f7-ede3-4506-a743-892303b3d9b7-kube-api-access-b468l\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.719841 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719764 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m48n9\" (UniqueName: \"kubernetes.io/projected/f6ad02c8-88c6-424e-8f61-266e366dcbba-kube-api-access-m48n9\") pod \"service-ca-operator-d6fc45fc5-5zk6f\" (UID: \"f6ad02c8-88c6-424e-8f61-266e366dcbba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" Apr 22 19:25:36.719841 ip-10-0-135-144 kubenswrapper[2574]: I0422 
19:25:36.719795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ad02c8-88c6-424e-8f61-266e366dcbba-config\") pod \"service-ca-operator-d6fc45fc5-5zk6f\" (UID: \"f6ad02c8-88c6-424e-8f61-266e366dcbba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" Apr 22 19:25:36.719841 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719821 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/38aa17f7-ede3-4506-a743-892303b3d9b7-snapshots\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.720101 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719856 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-certificates\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.720101 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719884 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-trusted-ca\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.720101 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719940 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38aa17f7-ede3-4506-a743-892303b3d9b7-tmp\") pod \"insights-operator-585dfdc468-dnrn8\" 
(UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.720101 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.719975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38aa17f7-ede3-4506-a743-892303b3d9b7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.720101 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.720002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38aa17f7-ede3-4506-a743-892303b3d9b7-serving-cert\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.720101 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.720027 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.720382 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.720341 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38aa17f7-ede3-4506-a743-892303b3d9b7-service-ca-bundle\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.720382 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.720374 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tmp\" (UniqueName: \"kubernetes.io/empty-dir/38aa17f7-ede3-4506-a743-892303b3d9b7-tmp\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.720476 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.720409 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/38aa17f7-ede3-4506-a743-892303b3d9b7-snapshots\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.720817 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.720800 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38aa17f7-ede3-4506-a743-892303b3d9b7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.722290 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.722270 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38aa17f7-ede3-4506-a743-892303b3d9b7-serving-cert\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.734944 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.734918 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b468l\" (UniqueName: \"kubernetes.io/projected/38aa17f7-ede3-4506-a743-892303b3d9b7-kube-api-access-b468l\") pod \"insights-operator-585dfdc468-dnrn8\" (UID: \"38aa17f7-ede3-4506-a743-892303b3d9b7\") " pod="openshift-insights/insights-operator-585dfdc468-dnrn8" Apr 22 19:25:36.821380 ip-10-0-135-144 kubenswrapper[2574]: I0422 
19:25:36.821301 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-bound-sa-token\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.821380 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.821348 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnvsd\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-kube-api-access-cnvsd\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.821585 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.821382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-image-registry-private-configuration\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.821585 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.821402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-ca-trust-extracted\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.821585 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.821418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-installation-pull-secrets\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.821585 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.821443 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6ad02c8-88c6-424e-8f61-266e366dcbba-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5zk6f\" (UID: \"f6ad02c8-88c6-424e-8f61-266e366dcbba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" Apr 22 19:25:36.821585 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.821469 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qssbn\" (UniqueName: \"kubernetes.io/projected/8a37d76d-bb5f-4997-83f8-5da1496ba0e9-kube-api-access-qssbn\") pod \"volume-data-source-validator-7c6cbb6c87-bf5vp\" (UID: \"8a37d76d-bb5f-4997-83f8-5da1496ba0e9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bf5vp" Apr 22 19:25:36.821861 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.821635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m48n9\" (UniqueName: \"kubernetes.io/projected/f6ad02c8-88c6-424e-8f61-266e366dcbba-kube-api-access-m48n9\") pod \"service-ca-operator-d6fc45fc5-5zk6f\" (UID: \"f6ad02c8-88c6-424e-8f61-266e366dcbba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" Apr 22 19:25:36.821861 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.821680 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ad02c8-88c6-424e-8f61-266e366dcbba-config\") pod \"service-ca-operator-d6fc45fc5-5zk6f\" (UID: \"f6ad02c8-88c6-424e-8f61-266e366dcbba\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" Apr 22 19:25:36.821861 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.821742 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-certificates\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.821861 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.821775 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-trusted-ca\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.821861 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.821816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.822071 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.821911 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-ca-trust-extracted\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.822071 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:36.821942 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found 
Apr 22 19:25:36.822071 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:36.821956 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74cf6f6fcc-qk8ss: secret "image-registry-tls" not found Apr 22 19:25:36.822071 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:36.822014 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls podName:90a338b4-011e-41a1-9e5a-eeb91dc3dc21 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:37.321995656 +0000 UTC m=+126.011018736 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls") pod "image-registry-74cf6f6fcc-qk8ss" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21") : secret "image-registry-tls" not found Apr 22 19:25:36.822378 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.822356 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-certificates\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" Apr 22 19:25:36.822778 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.822757 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ad02c8-88c6-424e-8f61-266e366dcbba-config\") pod \"service-ca-operator-d6fc45fc5-5zk6f\" (UID: \"f6ad02c8-88c6-424e-8f61-266e366dcbba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" Apr 22 19:25:36.822876 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.822855 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-trusted-ca\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:36.823957 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.823933 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6ad02c8-88c6-424e-8f61-266e366dcbba-serving-cert\") pod \"service-ca-operator-d6fc45fc5-5zk6f\" (UID: \"f6ad02c8-88c6-424e-8f61-266e366dcbba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f"
Apr 22 19:25:36.824039 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.823975 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-installation-pull-secrets\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:36.824547 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.824527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-image-registry-private-configuration\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:36.832042 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.832017 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m48n9\" (UniqueName: \"kubernetes.io/projected/f6ad02c8-88c6-424e-8f61-266e366dcbba-kube-api-access-m48n9\") pod \"service-ca-operator-d6fc45fc5-5zk6f\" (UID: \"f6ad02c8-88c6-424e-8f61-266e366dcbba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f"
Apr 22 19:25:36.832130 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.832071 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-bound-sa-token\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:36.832130 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.832087 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qssbn\" (UniqueName: \"kubernetes.io/projected/8a37d76d-bb5f-4997-83f8-5da1496ba0e9-kube-api-access-qssbn\") pod \"volume-data-source-validator-7c6cbb6c87-bf5vp\" (UID: \"8a37d76d-bb5f-4997-83f8-5da1496ba0e9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bf5vp"
Apr 22 19:25:36.832243 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.832227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnvsd\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-kube-api-access-cnvsd\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:36.868412 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.868393 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-dnrn8"
Apr 22 19:25:36.956282 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.956255 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bf5vp"
Apr 22 19:25:36.978830 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.978801 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dnrn8"]
Apr 22 19:25:36.980950 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:36.980920 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f"
Apr 22 19:25:36.981737 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:25:36.981700 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38aa17f7_ede3_4506_a743_892303b3d9b7.slice/crio-1749253a48b6ecc1bf688ad3280fd6a36b6aeea7fc21fbe5f592213e979cda66 WatchSource:0}: Error finding container 1749253a48b6ecc1bf688ad3280fd6a36b6aeea7fc21fbe5f592213e979cda66: Status 404 returned error can't find the container with id 1749253a48b6ecc1bf688ad3280fd6a36b6aeea7fc21fbe5f592213e979cda66
Apr 22 19:25:37.087435 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:37.087235 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bf5vp"]
Apr 22 19:25:37.089315 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:25:37.089286 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a37d76d_bb5f_4997_83f8_5da1496ba0e9.slice/crio-e4bfa9b1d8a483ca3d849dca8ebbaf6326bae5c7e42b7f9253434f7ff2aaf6bc WatchSource:0}: Error finding container e4bfa9b1d8a483ca3d849dca8ebbaf6326bae5c7e42b7f9253434f7ff2aaf6bc: Status 404 returned error can't find the container with id e4bfa9b1d8a483ca3d849dca8ebbaf6326bae5c7e42b7f9253434f7ff2aaf6bc
Apr 22 19:25:37.100238 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:37.100215 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f"]
Apr 22 19:25:37.102471 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:25:37.102445 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6ad02c8_88c6_424e_8f61_266e366dcbba.slice/crio-e0b1f684efa80cdf441765a8b193c70a6f8a17a2f83fb03b23b9912abb40aeb7 WatchSource:0}: Error finding container e0b1f684efa80cdf441765a8b193c70a6f8a17a2f83fb03b23b9912abb40aeb7: Status 404 returned error can't find the container with id e0b1f684efa80cdf441765a8b193c70a6f8a17a2f83fb03b23b9912abb40aeb7
Apr 22 19:25:37.250430 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:37.250389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dnrn8" event={"ID":"38aa17f7-ede3-4506-a743-892303b3d9b7","Type":"ContainerStarted","Data":"1749253a48b6ecc1bf688ad3280fd6a36b6aeea7fc21fbe5f592213e979cda66"}
Apr 22 19:25:37.251263 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:37.251240 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bf5vp" event={"ID":"8a37d76d-bb5f-4997-83f8-5da1496ba0e9","Type":"ContainerStarted","Data":"e4bfa9b1d8a483ca3d849dca8ebbaf6326bae5c7e42b7f9253434f7ff2aaf6bc"}
Apr 22 19:25:37.252152 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:37.252133 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" event={"ID":"f6ad02c8-88c6-424e-8f61-266e366dcbba","Type":"ContainerStarted","Data":"e0b1f684efa80cdf441765a8b193c70a6f8a17a2f83fb03b23b9912abb40aeb7"}
Apr 22 19:25:37.326894 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:37.326863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:37.327024 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:37.327011 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:25:37.327024 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:37.327026 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74cf6f6fcc-qk8ss: secret "image-registry-tls" not found
Apr 22 19:25:37.327134 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:37.327083 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls podName:90a338b4-011e-41a1-9e5a-eeb91dc3dc21 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:38.327065593 +0000 UTC m=+127.016088694 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls") pod "image-registry-74cf6f6fcc-qk8ss" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21") : secret "image-registry-tls" not found
Apr 22 19:25:38.335169 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:38.335130 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:38.335648 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:38.335286 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:25:38.335648 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:38.335305 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74cf6f6fcc-qk8ss: secret "image-registry-tls" not found
Apr 22 19:25:38.335648 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:38.335387 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls podName:90a338b4-011e-41a1-9e5a-eeb91dc3dc21 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:40.335367656 +0000 UTC m=+129.024390757 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls") pod "image-registry-74cf6f6fcc-qk8ss" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21") : secret "image-registry-tls" not found
Apr 22 19:25:40.259701 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:40.259666 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dnrn8" event={"ID":"38aa17f7-ede3-4506-a743-892303b3d9b7","Type":"ContainerStarted","Data":"22adb9701ee424d8556054f8f79040dc46e44a4db148a6ad58b18cac83676d78"}
Apr 22 19:25:40.260986 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:40.260958 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bf5vp" event={"ID":"8a37d76d-bb5f-4997-83f8-5da1496ba0e9","Type":"ContainerStarted","Data":"dd613e29a1d5491cce14527e60d5758259e89252613e40e16eeecdf05e5404b4"}
Apr 22 19:25:40.262146 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:40.262124 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" event={"ID":"f6ad02c8-88c6-424e-8f61-266e366dcbba","Type":"ContainerStarted","Data":"47ec169b48d3c7bc174425eb16fca1951b208c0c704795d40075a93b5765964f"}
Apr 22 19:25:40.277517 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:40.277469 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-dnrn8" podStartSLOduration=1.870092315 podStartE2EDuration="4.277454863s" podCreationTimestamp="2026-04-22 19:25:36 +0000 UTC" firstStartedPulling="2026-04-22 19:25:36.985140684 +0000 UTC m=+125.674163766" lastFinishedPulling="2026-04-22 19:25:39.392503234 +0000 UTC m=+128.081526314" observedRunningTime="2026-04-22 19:25:40.27617359 +0000 UTC m=+128.965196691" watchObservedRunningTime="2026-04-22 19:25:40.277454863 +0000 UTC m=+128.966477965"
Apr 22 19:25:40.294404 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:40.294363 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bf5vp" podStartSLOduration=1.99648212 podStartE2EDuration="4.29435061s" podCreationTimestamp="2026-04-22 19:25:36 +0000 UTC" firstStartedPulling="2026-04-22 19:25:37.091106983 +0000 UTC m=+125.780130066" lastFinishedPulling="2026-04-22 19:25:39.388975473 +0000 UTC m=+128.077998556" observedRunningTime="2026-04-22 19:25:40.291753713 +0000 UTC m=+128.980776814" watchObservedRunningTime="2026-04-22 19:25:40.29435061 +0000 UTC m=+128.983373710"
Apr 22 19:25:40.309346 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:40.309307 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" podStartSLOduration=2.017375431 podStartE2EDuration="4.309295032s" podCreationTimestamp="2026-04-22 19:25:36 +0000 UTC" firstStartedPulling="2026-04-22 19:25:37.104106249 +0000 UTC m=+125.793129328" lastFinishedPulling="2026-04-22 19:25:39.396025835 +0000 UTC m=+128.085048929" observedRunningTime="2026-04-22 19:25:40.308696393 +0000 UTC m=+128.997719498" watchObservedRunningTime="2026-04-22 19:25:40.309295032 +0000 UTC m=+128.998318132"
Apr 22 19:25:40.350082 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:40.350047 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:40.350913 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:40.350329 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:25:40.350913 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:40.350353 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74cf6f6fcc-qk8ss: secret "image-registry-tls" not found
Apr 22 19:25:40.350913 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:40.350410 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls podName:90a338b4-011e-41a1-9e5a-eeb91dc3dc21 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:44.350392631 +0000 UTC m=+133.039415735 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls") pod "image-registry-74cf6f6fcc-qk8ss" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21") : secret "image-registry-tls" not found
Apr 22 19:25:40.972668 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:40.972633 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lvq7b"]
Apr 22 19:25:40.975748 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:40.975733 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lvq7b"
Apr 22 19:25:40.978745 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:40.978713 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mjfnw\""
Apr 22 19:25:40.986566 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:40.986544 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lvq7b"]
Apr 22 19:25:41.054858 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:41.054817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zrmt\" (UniqueName: \"kubernetes.io/projected/f5bffc3c-5e52-4276-99b2-b8ec48bdd66a-kube-api-access-4zrmt\") pod \"network-check-source-8894fc9bd-lvq7b\" (UID: \"f5bffc3c-5e52-4276-99b2-b8ec48bdd66a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lvq7b"
Apr 22 19:25:41.155439 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:41.155361 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zrmt\" (UniqueName: \"kubernetes.io/projected/f5bffc3c-5e52-4276-99b2-b8ec48bdd66a-kube-api-access-4zrmt\") pod \"network-check-source-8894fc9bd-lvq7b\" (UID: \"f5bffc3c-5e52-4276-99b2-b8ec48bdd66a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lvq7b"
Apr 22 19:25:41.163131 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:41.163104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zrmt\" (UniqueName: \"kubernetes.io/projected/f5bffc3c-5e52-4276-99b2-b8ec48bdd66a-kube-api-access-4zrmt\") pod \"network-check-source-8894fc9bd-lvq7b\" (UID: \"f5bffc3c-5e52-4276-99b2-b8ec48bdd66a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lvq7b"
Apr 22 19:25:41.284160 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:41.284077 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lvq7b"
Apr 22 19:25:41.397856 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:41.397823 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lvq7b"]
Apr 22 19:25:41.400530 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:25:41.400502 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5bffc3c_5e52_4276_99b2_b8ec48bdd66a.slice/crio-2da3a95ba6f52a0d80cf792fc3b0fbb4b491768b33b1641ac50d2efdfcb726f5 WatchSource:0}: Error finding container 2da3a95ba6f52a0d80cf792fc3b0fbb4b491768b33b1641ac50d2efdfcb726f5: Status 404 returned error can't find the container with id 2da3a95ba6f52a0d80cf792fc3b0fbb4b491768b33b1641ac50d2efdfcb726f5
Apr 22 19:25:41.658663 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:41.658630 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:25:41.658836 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:41.658808 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:25:41.658886 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:41.658875 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs podName:3f7c0766-21b4-4016-9a86-f022651a4b2e nodeName:}" failed. No retries permitted until 2026-04-22 19:27:43.658860257 +0000 UTC m=+252.347883339 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs") pod "network-metrics-daemon-rwxr2" (UID: "3f7c0766-21b4-4016-9a86-f022651a4b2e") : secret "metrics-daemon-secret" not found
Apr 22 19:25:42.267788 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:42.267750 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lvq7b" event={"ID":"f5bffc3c-5e52-4276-99b2-b8ec48bdd66a","Type":"ContainerStarted","Data":"558c36afb7d9f7ee37c6402b7e35d7f34af65beae352cde860ce926b6f4e1bf6"}
Apr 22 19:25:42.267788 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:42.267792 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lvq7b" event={"ID":"f5bffc3c-5e52-4276-99b2-b8ec48bdd66a","Type":"ContainerStarted","Data":"2da3a95ba6f52a0d80cf792fc3b0fbb4b491768b33b1641ac50d2efdfcb726f5"}
Apr 22 19:25:42.285099 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:42.285057 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lvq7b" podStartSLOduration=2.285043478 podStartE2EDuration="2.285043478s" podCreationTimestamp="2026-04-22 19:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:25:42.284644692 +0000 UTC m=+130.973667811" watchObservedRunningTime="2026-04-22 19:25:42.285043478 +0000 UTC m=+130.974066579"
Apr 22 19:25:43.745844 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:43.745816 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cqg2m_3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2/dns-node-resolver/0.log"
Apr 22 19:25:44.380313 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:44.380253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:44.380509 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:44.380380 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:25:44.380509 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:44.380392 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-74cf6f6fcc-qk8ss: secret "image-registry-tls" not found
Apr 22 19:25:44.380509 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:25:44.380441 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls podName:90a338b4-011e-41a1-9e5a-eeb91dc3dc21 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:52.380428058 +0000 UTC m=+141.069451138 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls") pod "image-registry-74cf6f6fcc-qk8ss" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21") : secret "image-registry-tls" not found
Apr 22 19:25:44.943870 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:44.943842 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-b9dhn_277e1d66-9594-42a8-b953-7fcddeac7dad/node-ca/0.log"
Apr 22 19:25:52.440126 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:52.440083 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:52.442425 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:52.442395 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls\") pod \"image-registry-74cf6f6fcc-qk8ss\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") " pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:52.575528 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:52.575486 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:52.698323 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:52.698250 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"]
Apr 22 19:25:52.701150 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:25:52.701119 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90a338b4_011e_41a1_9e5a_eeb91dc3dc21.slice/crio-3ff9890caa4f26b3d829a36dc20724470d615371dd1ac665247aba563558762c WatchSource:0}: Error finding container 3ff9890caa4f26b3d829a36dc20724470d615371dd1ac665247aba563558762c: Status 404 returned error can't find the container with id 3ff9890caa4f26b3d829a36dc20724470d615371dd1ac665247aba563558762c
Apr 22 19:25:53.298856 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:53.298824 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" event={"ID":"90a338b4-011e-41a1-9e5a-eeb91dc3dc21","Type":"ContainerStarted","Data":"1502e39921155e432202f571e99da78f4c673d7c3b68a9a1c173c781a0c0e463"}
Apr 22 19:25:53.298856 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:53.298858 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" event={"ID":"90a338b4-011e-41a1-9e5a-eeb91dc3dc21","Type":"ContainerStarted","Data":"3ff9890caa4f26b3d829a36dc20724470d615371dd1ac665247aba563558762c"}
Apr 22 19:25:53.299061 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:53.298971 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:25:53.319846 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:25:53.319796 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" podStartSLOduration=17.319782219 podStartE2EDuration="17.319782219s" podCreationTimestamp="2026-04-22 19:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:25:53.319397542 +0000 UTC m=+142.008420654" watchObservedRunningTime="2026-04-22 19:25:53.319782219 +0000 UTC m=+142.008805319"
Apr 22 19:26:04.248897 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.248863 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-sbxjf"]
Apr 22 19:26:04.253994 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.253975 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.257795 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.257771 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-2cndx\""
Apr 22 19:26:04.257920 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.257896 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 19:26:04.258975 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.258949 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 19:26:04.271928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.271906 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sbxjf"]
Apr 22 19:26:04.276596 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.276573 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"]
Apr 22 19:26:04.351747 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.351704 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6f46764956-vbbf4"]
Apr 22 19:26:04.354695 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.354680 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f46764956-vbbf4"
Apr 22 19:26:04.383336 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.383307 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f46764956-vbbf4"]
Apr 22 19:26:04.427639 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.427611 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9dxt\" (UniqueName: \"kubernetes.io/projected/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-kube-api-access-x9dxt\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.427803 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.427652 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-crio-socket\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.427803 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.427714 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.427885 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.427814 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-data-volume\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.427885 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.427846 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.528759 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.528663 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/675eace7-6336-4022-a08d-68006d2cbe80-registry-tls\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4"
Apr 22 19:26:04.528759 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.528702 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/675eace7-6336-4022-a08d-68006d2cbe80-image-registry-private-configuration\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4"
Apr 22 19:26:04.528929 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.528772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-data-volume\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.528929 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.528823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.528929 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.528848 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/675eace7-6336-4022-a08d-68006d2cbe80-registry-certificates\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4"
Apr 22 19:26:04.528929 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.528881 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9dxt\" (UniqueName: \"kubernetes.io/projected/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-kube-api-access-x9dxt\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.529074 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.528967 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/675eace7-6336-4022-a08d-68006d2cbe80-installation-pull-secrets\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4"
Apr 22 19:26:04.529074 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.528995 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b445d\" (UniqueName: \"kubernetes.io/projected/675eace7-6336-4022-a08d-68006d2cbe80-kube-api-access-b445d\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4"
Apr 22 19:26:04.529074 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.529022 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/675eace7-6336-4022-a08d-68006d2cbe80-trusted-ca\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4"
Apr 22 19:26:04.529179 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.529075 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-crio-socket\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.529179 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.529091 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-data-volume\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.529179 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.529109 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/675eace7-6336-4022-a08d-68006d2cbe80-bound-sa-token\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4"
Apr 22 19:26:04.529179 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.529158 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-crio-socket\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.529179 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.529167 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/675eace7-6336-4022-a08d-68006d2cbe80-ca-trust-extracted\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4"
Apr 22 19:26:04.529340 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.529195 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.529548 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.529530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.531493 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.531472 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.555154 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.555130 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9dxt\" (UniqueName: \"kubernetes.io/projected/eaab3bc0-fc81-44b7-83fc-a0b27939da1a-kube-api-access-x9dxt\") pod \"insights-runtime-extractor-sbxjf\" (UID: \"eaab3bc0-fc81-44b7-83fc-a0b27939da1a\") " pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.562297 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.562281 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sbxjf"
Apr 22 19:26:04.630209 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.630139 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/675eace7-6336-4022-a08d-68006d2cbe80-bound-sa-token\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4"
Apr 22 19:26:04.630209 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.630178 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/675eace7-6336-4022-a08d-68006d2cbe80-ca-trust-extracted\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4"
Apr 22 19:26:04.630484 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.630461 2574 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/675eace7-6336-4022-a08d-68006d2cbe80-registry-tls\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.630540 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.630513 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/675eace7-6336-4022-a08d-68006d2cbe80-image-registry-private-configuration\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.630583 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.630570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/675eace7-6336-4022-a08d-68006d2cbe80-registry-certificates\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.630626 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.630601 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/675eace7-6336-4022-a08d-68006d2cbe80-installation-pull-secrets\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.630672 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.630626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b445d\" (UniqueName: \"kubernetes.io/projected/675eace7-6336-4022-a08d-68006d2cbe80-kube-api-access-b445d\") pod 
\"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.630672 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.630661 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/675eace7-6336-4022-a08d-68006d2cbe80-trusted-ca\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.630990 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.630949 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/675eace7-6336-4022-a08d-68006d2cbe80-ca-trust-extracted\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.631432 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.631411 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/675eace7-6336-4022-a08d-68006d2cbe80-registry-certificates\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.631771 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.631709 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/675eace7-6336-4022-a08d-68006d2cbe80-trusted-ca\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.633512 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.633420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/675eace7-6336-4022-a08d-68006d2cbe80-installation-pull-secrets\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.633683 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.633568 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/675eace7-6336-4022-a08d-68006d2cbe80-image-registry-private-configuration\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.633907 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.633859 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/675eace7-6336-4022-a08d-68006d2cbe80-registry-tls\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.639003 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.638960 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/675eace7-6336-4022-a08d-68006d2cbe80-bound-sa-token\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.640823 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.640798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b445d\" (UniqueName: \"kubernetes.io/projected/675eace7-6336-4022-a08d-68006d2cbe80-kube-api-access-b445d\") pod \"image-registry-6f46764956-vbbf4\" (UID: \"675eace7-6336-4022-a08d-68006d2cbe80\") " 
pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.663067 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.663047 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:04.688170 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.688122 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sbxjf"] Apr 22 19:26:04.691637 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:26:04.691611 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaab3bc0_fc81_44b7_83fc_a0b27939da1a.slice/crio-f751e3f9b0431f4f2031b5ad78fee040eed7c204926f6af62df165b364cfa0c7 WatchSource:0}: Error finding container f751e3f9b0431f4f2031b5ad78fee040eed7c204926f6af62df165b364cfa0c7: Status 404 returned error can't find the container with id f751e3f9b0431f4f2031b5ad78fee040eed7c204926f6af62df165b364cfa0c7 Apr 22 19:26:04.791736 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:04.791642 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f46764956-vbbf4"] Apr 22 19:26:04.795410 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:26:04.795385 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod675eace7_6336_4022_a08d_68006d2cbe80.slice/crio-6e59e8aa43331bcc37a86705852ae8e145194c395c4b9c052ba0ec7cd5003336 WatchSource:0}: Error finding container 6e59e8aa43331bcc37a86705852ae8e145194c395c4b9c052ba0ec7cd5003336: Status 404 returned error can't find the container with id 6e59e8aa43331bcc37a86705852ae8e145194c395c4b9c052ba0ec7cd5003336 Apr 22 19:26:05.328644 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:05.328613 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f46764956-vbbf4" 
event={"ID":"675eace7-6336-4022-a08d-68006d2cbe80","Type":"ContainerStarted","Data":"9354e1fa1b36ab30604a4cb214eca801275c1e439896be75b4b23b5a469324cc"} Apr 22 19:26:05.329027 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:05.328655 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f46764956-vbbf4" event={"ID":"675eace7-6336-4022-a08d-68006d2cbe80","Type":"ContainerStarted","Data":"6e59e8aa43331bcc37a86705852ae8e145194c395c4b9c052ba0ec7cd5003336"} Apr 22 19:26:05.329027 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:05.328684 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6f46764956-vbbf4" Apr 22 19:26:05.329911 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:05.329891 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sbxjf" event={"ID":"eaab3bc0-fc81-44b7-83fc-a0b27939da1a","Type":"ContainerStarted","Data":"258eaefb6013f7e2050c70e1d9d9cfae95d0c48a8247089a71c1192ee7466c48"} Apr 22 19:26:05.329988 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:05.329918 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sbxjf" event={"ID":"eaab3bc0-fc81-44b7-83fc-a0b27939da1a","Type":"ContainerStarted","Data":"f751e3f9b0431f4f2031b5ad78fee040eed7c204926f6af62df165b364cfa0c7"} Apr 22 19:26:05.350550 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:05.350513 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6f46764956-vbbf4" podStartSLOduration=1.350498256 podStartE2EDuration="1.350498256s" podCreationTimestamp="2026-04-22 19:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:26:05.34988908 +0000 UTC m=+154.038912182" watchObservedRunningTime="2026-04-22 19:26:05.350498256 
+0000 UTC m=+154.039521385" Apr 22 19:26:06.338888 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:06.338843 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sbxjf" event={"ID":"eaab3bc0-fc81-44b7-83fc-a0b27939da1a","Type":"ContainerStarted","Data":"795e6c06eebc35bbdc1cf798043ff8e5a3dbc8460bb14c3260e2139bd502490d"} Apr 22 19:26:07.243521 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:26:07.243476 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rbr82" podUID="d9acd301-c698-4dda-b95a-e48e9dfbf761" Apr 22 19:26:07.250632 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:26:07.250586 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-tcrbq" podUID="79e5fa54-5efe-4b12-a3d1-c4a889502cea" Apr 22 19:26:07.342938 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:07.342905 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tcrbq" Apr 22 19:26:07.342938 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:07.342897 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sbxjf" event={"ID":"eaab3bc0-fc81-44b7-83fc-a0b27939da1a","Type":"ContainerStarted","Data":"a4350d6e7d5c96c1e2eb9aa9dadef7b87b0952f0ca9c751d0d2ef6ce1547921f"} Apr 22 19:26:07.343403 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:07.343172 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rbr82" Apr 22 19:26:07.360226 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:07.360181 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-sbxjf" podStartSLOduration=1.432462337 podStartE2EDuration="3.36017032s" podCreationTimestamp="2026-04-22 19:26:04 +0000 UTC" firstStartedPulling="2026-04-22 19:26:04.745362326 +0000 UTC m=+153.434385405" lastFinishedPulling="2026-04-22 19:26:06.673070306 +0000 UTC m=+155.362093388" observedRunningTime="2026-04-22 19:26:07.360067971 +0000 UTC m=+156.049091085" watchObservedRunningTime="2026-04-22 19:26:07.36017032 +0000 UTC m=+156.049193420" Apr 22 19:26:08.870571 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:26:08.870535 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-rwxr2" podUID="3f7c0766-21b4-4016-9a86-f022651a4b2e" Apr 22 19:26:12.193276 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:12.193190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82" Apr 22 19:26:12.193276 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:12.193251 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq" Apr 22 19:26:12.195487 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:12.195463 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9acd301-c698-4dda-b95a-e48e9dfbf761-metrics-tls\") pod \"dns-default-rbr82\" (UID: \"d9acd301-c698-4dda-b95a-e48e9dfbf761\") " pod="openshift-dns/dns-default-rbr82" Apr 22 19:26:12.195619 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:12.195602 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/79e5fa54-5efe-4b12-a3d1-c4a889502cea-cert\") pod \"ingress-canary-tcrbq\" (UID: \"79e5fa54-5efe-4b12-a3d1-c4a889502cea\") " pod="openshift-ingress-canary/ingress-canary-tcrbq" Apr 22 19:26:12.446990 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:12.446916 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6f5sr\"" Apr 22 19:26:12.448019 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:12.448002 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hrkhq\"" Apr 22 19:26:12.455226 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:12.455210 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rbr82" Apr 22 19:26:12.455304 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:12.455284 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tcrbq" Apr 22 19:26:12.598136 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:12.598104 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tcrbq"] Apr 22 19:26:12.601398 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:26:12.601366 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79e5fa54_5efe_4b12_a3d1_c4a889502cea.slice/crio-fb2accb4fdc989374db6655fb30c83aa3e922fb9fc3a3ea8b0c7c60eae2283c2 WatchSource:0}: Error finding container fb2accb4fdc989374db6655fb30c83aa3e922fb9fc3a3ea8b0c7c60eae2283c2: Status 404 returned error can't find the container with id fb2accb4fdc989374db6655fb30c83aa3e922fb9fc3a3ea8b0c7c60eae2283c2 Apr 22 19:26:12.613217 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:12.613193 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rbr82"] Apr 22 19:26:12.615618 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:26:12.615591 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9acd301_c698_4dda_b95a_e48e9dfbf761.slice/crio-8bf5d8f9ba0cb1a6514f0099589e5f3ba19f142499fea399fbea0b66c8981c6e WatchSource:0}: Error finding container 8bf5d8f9ba0cb1a6514f0099589e5f3ba19f142499fea399fbea0b66c8981c6e: Status 404 returned error can't find the container with id 8bf5d8f9ba0cb1a6514f0099589e5f3ba19f142499fea399fbea0b66c8981c6e Apr 22 19:26:13.359904 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:13.359863 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tcrbq" event={"ID":"79e5fa54-5efe-4b12-a3d1-c4a889502cea","Type":"ContainerStarted","Data":"fb2accb4fdc989374db6655fb30c83aa3e922fb9fc3a3ea8b0c7c60eae2283c2"} Apr 22 19:26:13.361165 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:13.361135 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rbr82" event={"ID":"d9acd301-c698-4dda-b95a-e48e9dfbf761","Type":"ContainerStarted","Data":"8bf5d8f9ba0cb1a6514f0099589e5f3ba19f142499fea399fbea0b66c8981c6e"} Apr 22 19:26:14.282905 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:14.282860 2574 patch_prober.go:28] interesting pod/image-registry-74cf6f6fcc-qk8ss container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 19:26:14.283085 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:14.282935 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" podUID="90a338b4-011e-41a1-9e5a-eeb91dc3dc21" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:26:15.367671 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:15.367624 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tcrbq" event={"ID":"79e5fa54-5efe-4b12-a3d1-c4a889502cea","Type":"ContainerStarted","Data":"b7519c11a5a4d3c42dcd01f251da980a88711b2f794b1423683c890a68d3eac5"} Apr 22 19:26:15.369044 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:15.369019 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rbr82" event={"ID":"d9acd301-c698-4dda-b95a-e48e9dfbf761","Type":"ContainerStarted","Data":"828dbb9e6a857f31446e669d9f87d24fbb5d4b4ba4c2061f4a171cbc294694c0"} Apr 22 19:26:15.369177 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:15.369048 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rbr82" 
event={"ID":"d9acd301-c698-4dda-b95a-e48e9dfbf761","Type":"ContainerStarted","Data":"1c11528954f35e30fbe17c9fbec0bceb5102d35dbba6a6b91d9cffa1aca6c0db"} Apr 22 19:26:15.369177 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:15.369137 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rbr82" Apr 22 19:26:15.388470 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:15.388425 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tcrbq" podStartSLOduration=129.277586448 podStartE2EDuration="2m11.388412629s" podCreationTimestamp="2026-04-22 19:24:04 +0000 UTC" firstStartedPulling="2026-04-22 19:26:12.603482125 +0000 UTC m=+161.292505203" lastFinishedPulling="2026-04-22 19:26:14.714308298 +0000 UTC m=+163.403331384" observedRunningTime="2026-04-22 19:26:15.387431986 +0000 UTC m=+164.076455087" watchObservedRunningTime="2026-04-22 19:26:15.388412629 +0000 UTC m=+164.077435727" Apr 22 19:26:15.404904 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:15.404856 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rbr82" podStartSLOduration=129.310624039 podStartE2EDuration="2m11.404838892s" podCreationTimestamp="2026-04-22 19:24:04 +0000 UTC" firstStartedPulling="2026-04-22 19:26:12.617376381 +0000 UTC m=+161.306399460" lastFinishedPulling="2026-04-22 19:26:14.71159123 +0000 UTC m=+163.400614313" observedRunningTime="2026-04-22 19:26:15.404559234 +0000 UTC m=+164.093582367" watchObservedRunningTime="2026-04-22 19:26:15.404838892 +0000 UTC m=+164.093861995" Apr 22 19:26:18.990162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:18.990131 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8699647d9f-kf74x"] Apr 22 19:26:18.992198 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:18.992182 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8699647d9f-kf74x" Apr 22 19:26:18.994912 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:18.994894 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 19:26:18.995405 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:18.995393 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-brsqm\"" Apr 22 19:26:18.996594 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:18.996572 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 19:26:18.997022 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:18.997005 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 19:26:18.997645 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:18.997626 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 19:26:18.997645 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:18.997642 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 19:26:18.997820 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:18.997669 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 19:26:18.997820 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:18.997765 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 19:26:19.002174 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.002155 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 19:26:19.005965 ip-10-0-135-144 
kubenswrapper[2574]: I0422 19:26:19.005946 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8699647d9f-kf74x"] Apr 22 19:26:19.046215 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.046181 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-oauth-serving-cert\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x" Apr 22 19:26:19.046387 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.046236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-oauth-config\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x" Apr 22 19:26:19.046387 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.046297 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-trusted-ca-bundle\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x" Apr 22 19:26:19.046387 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.046375 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-serving-cert\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x" Apr 22 19:26:19.046509 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.046396 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-config\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.046509 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.046427 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-service-ca\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.046509 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.046445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nwzn\" (UniqueName: \"kubernetes.io/projected/e393a8cd-0a87-4609-a470-1f8a4549d27b-kube-api-access-9nwzn\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.147550 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.147512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-oauth-serving-cert\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.147706 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.147570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-oauth-config\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.147706 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.147591 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-trusted-ca-bundle\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.147706 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.147612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-serving-cert\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.147706 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.147629 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-config\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.147706 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.147645 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-service-ca\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.147706 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.147661 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nwzn\" (UniqueName: \"kubernetes.io/projected/e393a8cd-0a87-4609-a470-1f8a4549d27b-kube-api-access-9nwzn\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.148452 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.148425 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-config\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.148563 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.148428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-oauth-serving-cert\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.148563 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.148486 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-service-ca\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.148563 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.148494 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-trusted-ca-bundle\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.150131 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.150100 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-serving-cert\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.150222 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.150128 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-oauth-config\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.155953 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.155932 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nwzn\" (UniqueName: \"kubernetes.io/projected/e393a8cd-0a87-4609-a470-1f8a4549d27b-kube-api-access-9nwzn\") pod \"console-8699647d9f-kf74x\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.301011 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.300894 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:19.426239 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:19.426209 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8699647d9f-kf74x"]
Apr 22 19:26:19.428941 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:26:19.428914 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode393a8cd_0a87_4609_a470_1f8a4549d27b.slice/crio-c9e82285f70ac3cd4583d3602f761afc15c42b852a45fc0365d7a9d4cafa32c9 WatchSource:0}: Error finding container c9e82285f70ac3cd4583d3602f761afc15c42b852a45fc0365d7a9d4cafa32c9: Status 404 returned error can't find the container with id c9e82285f70ac3cd4583d3602f761afc15c42b852a45fc0365d7a9d4cafa32c9
Apr 22 19:26:20.383526 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:20.383484 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8699647d9f-kf74x" event={"ID":"e393a8cd-0a87-4609-a470-1f8a4549d27b","Type":"ContainerStarted","Data":"c9e82285f70ac3cd4583d3602f761afc15c42b852a45fc0365d7a9d4cafa32c9"}
Apr 22 19:26:21.862762 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:21.862741 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2"
Apr 22 19:26:22.390511 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:22.390474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8699647d9f-kf74x" event={"ID":"e393a8cd-0a87-4609-a470-1f8a4549d27b","Type":"ContainerStarted","Data":"977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f"}
Apr 22 19:26:22.409341 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:22.409296 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8699647d9f-kf74x" podStartSLOduration=2.026022304 podStartE2EDuration="4.409283597s" podCreationTimestamp="2026-04-22 19:26:18 +0000 UTC" firstStartedPulling="2026-04-22 19:26:19.432148618 +0000 UTC m=+168.121171697" lastFinishedPulling="2026-04-22 19:26:21.815409912 +0000 UTC m=+170.504432990" observedRunningTime="2026-04-22 19:26:22.408617663 +0000 UTC m=+171.097640776" watchObservedRunningTime="2026-04-22 19:26:22.409283597 +0000 UTC m=+171.098306697"
Apr 22 19:26:24.280696 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:24.280665 2574 patch_prober.go:28] interesting pod/image-registry-74cf6f6fcc-qk8ss container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 19:26:24.281073 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:24.280739 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" podUID="90a338b4-011e-41a1-9e5a-eeb91dc3dc21" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:26:24.667273 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:24.667244 2574 patch_prober.go:28] interesting pod/image-registry-6f46764956-vbbf4 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 19:26:24.667426 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:24.667294 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6f46764956-vbbf4" podUID="675eace7-6336-4022-a08d-68006d2cbe80" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:26:25.373386 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:25.373359 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rbr82"
Apr 22 19:26:26.343403 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:26.343370 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6f46764956-vbbf4"
Apr 22 19:26:27.339789 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.339757 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw"]
Apr 22 19:26:27.341843 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.341826 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw"
Apr 22 19:26:27.344527 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.344499 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 22 19:26:27.344637 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.344603 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 19:26:27.344695 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.344662 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 19:26:27.345679 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.345654 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 19:26:27.345793 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.345711 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-n28z5\""
Apr 22 19:26:27.345793 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.345776 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 22 19:26:27.354987 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.354965 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw"]
Apr 22 19:26:27.367351 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.367329 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-sbxm6"]
Apr 22 19:26:27.369495 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.369469 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:27.374180 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.374159 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 19:26:27.374266 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.374213 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-v96mz\""
Apr 22 19:26:27.374471 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.374457 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 19:26:27.375351 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.375337 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 19:26:27.409736 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.409691 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-accelerators-collector-config\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:27.409878 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.409749 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eef75401-5fc2-4705-9f86-365e393d4977-metrics-client-ca\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:27.409878 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.409772 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eef75401-5fc2-4705-9f86-365e393d4977-sys\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:27.409878 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.409789 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-tls\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:27.409878 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.409811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-textfile\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:27.409878 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.409831 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aaecdab7-4807-4bae-baab-35520e039402-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-4xctw\" (UID: \"aaecdab7-4807-4bae-baab-35520e039402\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw"
Apr 22 19:26:27.409878 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.409856 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:27.409878 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.409871 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dspgx\" (UniqueName: \"kubernetes.io/projected/eef75401-5fc2-4705-9f86-365e393d4977-kube-api-access-dspgx\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:27.410101 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.409944 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/eef75401-5fc2-4705-9f86-365e393d4977-root\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:27.410101 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.410003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4js2p\" (UniqueName: \"kubernetes.io/projected/aaecdab7-4807-4bae-baab-35520e039402-kube-api-access-4js2p\") pod \"openshift-state-metrics-9d44df66c-4xctw\" (UID: \"aaecdab7-4807-4bae-baab-35520e039402\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw"
Apr 22 19:26:27.410101 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.410041 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-wtmp\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:27.410189 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.410103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/aaecdab7-4807-4bae-baab-35520e039402-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-4xctw\" (UID: \"aaecdab7-4807-4bae-baab-35520e039402\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw"
Apr 22 19:26:27.410189 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.410131 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aaecdab7-4807-4bae-baab-35520e039402-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-4xctw\" (UID: \"aaecdab7-4807-4bae-baab-35520e039402\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw"
Apr 22 19:26:27.511104 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/aaecdab7-4807-4bae-baab-35520e039402-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-4xctw\" (UID: \"aaecdab7-4807-4bae-baab-35520e039402\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw"
Apr 22 19:26:27.511104 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aaecdab7-4807-4bae-baab-35520e039402-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-4xctw\" (UID: \"aaecdab7-4807-4bae-baab-35520e039402\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw"
Apr 22 19:26:27.511345 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-accelerators-collector-config\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.511345 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eef75401-5fc2-4705-9f86-365e393d4977-metrics-client-ca\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.511345 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eef75401-5fc2-4705-9f86-365e393d4977-sys\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.511345 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511217 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-tls\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.511345 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511245 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-textfile\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.511345 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511271 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aaecdab7-4807-4bae-baab-35520e039402-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-4xctw\" (UID: \"aaecdab7-4807-4bae-baab-35520e039402\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw" Apr 22 19:26:27.511345 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.511345 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511307 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eef75401-5fc2-4705-9f86-365e393d4977-sys\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.511345 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511331 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dspgx\" (UniqueName: \"kubernetes.io/projected/eef75401-5fc2-4705-9f86-365e393d4977-kube-api-access-dspgx\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.511816 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:26:27.511369 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:26:27.511816 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/eef75401-5fc2-4705-9f86-365e393d4977-root\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.511816 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511445 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4js2p\" (UniqueName: \"kubernetes.io/projected/aaecdab7-4807-4bae-baab-35520e039402-kube-api-access-4js2p\") pod \"openshift-state-metrics-9d44df66c-4xctw\" (UID: \"aaecdab7-4807-4bae-baab-35520e039402\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw" Apr 22 19:26:27.511816 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:26:27.511484 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-tls podName:eef75401-5fc2-4705-9f86-365e393d4977 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:28.011447939 +0000 UTC m=+176.700471021 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-tls") pod "node-exporter-sbxm6" (UID: "eef75401-5fc2-4705-9f86-365e393d4977") : secret "node-exporter-tls" not found Apr 22 19:26:27.511816 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511533 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-wtmp\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.511816 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/eef75401-5fc2-4705-9f86-365e393d4977-root\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.511816 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511757 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-wtmp\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.512125 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511826 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eef75401-5fc2-4705-9f86-365e393d4977-metrics-client-ca\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.512125 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.511867 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-accelerators-collector-config\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.512125 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.512077 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-textfile\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.512351 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.512331 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aaecdab7-4807-4bae-baab-35520e039402-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-4xctw\" (UID: \"aaecdab7-4807-4bae-baab-35520e039402\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw" Apr 22 19:26:27.513734 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.513702 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aaecdab7-4807-4bae-baab-35520e039402-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-4xctw\" (UID: \"aaecdab7-4807-4bae-baab-35520e039402\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw" Apr 22 19:26:27.513838 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.513778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sbxm6\" (UID: 
\"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.513838 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.513778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/aaecdab7-4807-4bae-baab-35520e039402-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-4xctw\" (UID: \"aaecdab7-4807-4bae-baab-35520e039402\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw" Apr 22 19:26:27.527900 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.527878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dspgx\" (UniqueName: \"kubernetes.io/projected/eef75401-5fc2-4705-9f86-365e393d4977-kube-api-access-dspgx\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:27.528336 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.528319 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4js2p\" (UniqueName: \"kubernetes.io/projected/aaecdab7-4807-4bae-baab-35520e039402-kube-api-access-4js2p\") pod \"openshift-state-metrics-9d44df66c-4xctw\" (UID: \"aaecdab7-4807-4bae-baab-35520e039402\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw" Apr 22 19:26:27.651811 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.651784 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw" Apr 22 19:26:27.766979 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:27.766949 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw"] Apr 22 19:26:27.769682 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:26:27.769655 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaecdab7_4807_4bae_baab_35520e039402.slice/crio-33004e9b1eb85c02da6dcd59cead31d9c5013ebfc4827e2704ccaa2f423ff2ef WatchSource:0}: Error finding container 33004e9b1eb85c02da6dcd59cead31d9c5013ebfc4827e2704ccaa2f423ff2ef: Status 404 returned error can't find the container with id 33004e9b1eb85c02da6dcd59cead31d9c5013ebfc4827e2704ccaa2f423ff2ef Apr 22 19:26:28.017058 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:28.016973 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-tls\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6" Apr 22 19:26:28.017196 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:26:28.017126 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:26:28.017196 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:26:28.017193 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-tls podName:eef75401-5fc2-4705-9f86-365e393d4977 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:29.017176117 +0000 UTC m=+177.706199199 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-tls") pod "node-exporter-sbxm6" (UID: "eef75401-5fc2-4705-9f86-365e393d4977") : secret "node-exporter-tls" not found
Apr 22 19:26:28.407019 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:28.406982 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw" event={"ID":"aaecdab7-4807-4bae-baab-35520e039402","Type":"ContainerStarted","Data":"1db1d7a08b55c96d92f978ca6d9f8681b9684784999e253765239a0a023f59d5"}
Apr 22 19:26:28.407392 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:28.407043 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw" event={"ID":"aaecdab7-4807-4bae-baab-35520e039402","Type":"ContainerStarted","Data":"b9c79bda95fcdfdc1bc6bca94eef884723b3681185244ad8049d357bc1ee2719"}
Apr 22 19:26:28.407392 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:28.407059 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw" event={"ID":"aaecdab7-4807-4bae-baab-35520e039402","Type":"ContainerStarted","Data":"33004e9b1eb85c02da6dcd59cead31d9c5013ebfc4827e2704ccaa2f423ff2ef"}
Apr 22 19:26:29.027998 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.027955 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-tls\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:29.030320 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.030289 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eef75401-5fc2-4705-9f86-365e393d4977-node-exporter-tls\") pod \"node-exporter-sbxm6\" (UID: \"eef75401-5fc2-4705-9f86-365e393d4977\") " pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:29.178947 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.178915 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sbxm6"
Apr 22 19:26:29.188342 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:26:29.188315 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeef75401_5fc2_4705_9f86_365e393d4977.slice/crio-3e14dec2e46a4d580d0b733007fc27b7284177788a1b97c8a4cc5ffbfe55b493 WatchSource:0}: Error finding container 3e14dec2e46a4d580d0b733007fc27b7284177788a1b97c8a4cc5ffbfe55b493: Status 404 returned error can't find the container with id 3e14dec2e46a4d580d0b733007fc27b7284177788a1b97c8a4cc5ffbfe55b493
Apr 22 19:26:29.295182 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.295145 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" podUID="90a338b4-011e-41a1-9e5a-eeb91dc3dc21" containerName="registry" containerID="cri-o://1502e39921155e432202f571e99da78f4c673d7c3b68a9a1c173c781a0c0e463" gracePeriod=30
Apr 22 19:26:29.301701 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.301680 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:29.301830 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.301729 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:29.306570 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.306552 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:29.410482 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.410452 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sbxm6" event={"ID":"eef75401-5fc2-4705-9f86-365e393d4977","Type":"ContainerStarted","Data":"3e14dec2e46a4d580d0b733007fc27b7284177788a1b97c8a4cc5ffbfe55b493"}
Apr 22 19:26:29.412342 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.412318 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw" event={"ID":"aaecdab7-4807-4bae-baab-35520e039402","Type":"ContainerStarted","Data":"02f803045336d1d35635cf14435e411e013d7ba14847953f2bbc9ec08917b35c"}
Apr 22 19:26:29.413662 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.413642 2574 generic.go:358] "Generic (PLEG): container finished" podID="90a338b4-011e-41a1-9e5a-eeb91dc3dc21" containerID="1502e39921155e432202f571e99da78f4c673d7c3b68a9a1c173c781a0c0e463" exitCode=0
Apr 22 19:26:29.413833 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.413809 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" event={"ID":"90a338b4-011e-41a1-9e5a-eeb91dc3dc21","Type":"ContainerDied","Data":"1502e39921155e432202f571e99da78f4c673d7c3b68a9a1c173c781a0c0e463"}
Apr 22 19:26:29.418530 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.418504 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8699647d9f-kf74x"
Apr 22 19:26:29.431608 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.431533 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4xctw" podStartSLOduration=1.50379252 podStartE2EDuration="2.431521579s" podCreationTimestamp="2026-04-22 19:26:27 +0000 UTC" firstStartedPulling="2026-04-22 19:26:27.879315726 +0000 UTC m=+176.568338805" lastFinishedPulling="2026-04-22 19:26:28.807044785 +0000 UTC m=+177.496067864" observedRunningTime="2026-04-22 19:26:29.430770859 +0000 UTC m=+178.119793959" watchObservedRunningTime="2026-04-22 19:26:29.431521579 +0000 UTC m=+178.120544684"
Apr 22 19:26:29.530274 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.530253 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:26:29.632631 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.632592 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnvsd\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-kube-api-access-cnvsd\") pod \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") "
Apr 22 19:26:29.632816 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.632691 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls\") pod \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") "
Apr 22 19:26:29.632890 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.632811 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-trusted-ca\") pod \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") "
Apr 22 19:26:29.632890 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.632862 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-installation-pull-secrets\") pod \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") "
Apr 22 19:26:29.632996 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.632904 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-certificates\") pod \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") "
Apr 22 19:26:29.632996 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.632935 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-ca-trust-extracted\") pod \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") "
Apr 22 19:26:29.632996 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.632984 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-image-registry-private-configuration\") pod \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") "
Apr 22 19:26:29.633144 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.633061 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-bound-sa-token\") pod \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\" (UID: \"90a338b4-011e-41a1-9e5a-eeb91dc3dc21\") "
Apr 22 19:26:29.633346 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.633318 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "90a338b4-011e-41a1-9e5a-eeb91dc3dc21" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:26:29.633944 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.633906 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "90a338b4-011e-41a1-9e5a-eeb91dc3dc21" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:26:29.635252 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.635202 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "90a338b4-011e-41a1-9e5a-eeb91dc3dc21" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:26:29.635371 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.635343 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-kube-api-access-cnvsd" (OuterVolumeSpecName: "kube-api-access-cnvsd") pod "90a338b4-011e-41a1-9e5a-eeb91dc3dc21" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21"). InnerVolumeSpecName "kube-api-access-cnvsd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:26:29.635473 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.635428 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "90a338b4-011e-41a1-9e5a-eeb91dc3dc21" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:26:29.635870 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.635846 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "90a338b4-011e-41a1-9e5a-eeb91dc3dc21" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:26:29.636125 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.636096 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "90a338b4-011e-41a1-9e5a-eeb91dc3dc21" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:26:29.644135 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.644109 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "90a338b4-011e-41a1-9e5a-eeb91dc3dc21" (UID: "90a338b4-011e-41a1-9e5a-eeb91dc3dc21"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:26:29.734446 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.734355 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-bound-sa-token\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:26:29.734446 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.734385 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cnvsd\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-kube-api-access-cnvsd\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:26:29.734446 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.734396 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-tls\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:26:29.734446 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.734444 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-trusted-ca\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:26:29.734806 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.734460 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-installation-pull-secrets\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:26:29.734806 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.734474 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-registry-certificates\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:26:29.734806 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.734488 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-ca-trust-extracted\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:26:29.734806 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:29.734501 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/90a338b4-011e-41a1-9e5a-eeb91dc3dc21-image-registry-private-configuration\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:26:30.417497 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.417457 2574 generic.go:358] "Generic (PLEG): container finished" podID="eef75401-5fc2-4705-9f86-365e393d4977" containerID="249d77d72f18f69f8ca09d878428e28f6869583864ec06f96cf31abea846145f" exitCode=0
Apr 22 19:26:30.417968 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.417551 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sbxm6" event={"ID":"eef75401-5fc2-4705-9f86-365e393d4977","Type":"ContainerDied","Data":"249d77d72f18f69f8ca09d878428e28f6869583864ec06f96cf31abea846145f"}
Apr 22 19:26:30.418747 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.418711 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"
Apr 22 19:26:30.418829 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.418775 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-74cf6f6fcc-qk8ss" event={"ID":"90a338b4-011e-41a1-9e5a-eeb91dc3dc21","Type":"ContainerDied","Data":"3ff9890caa4f26b3d829a36dc20724470d615371dd1ac665247aba563558762c"}
Apr 22 19:26:30.418829 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.418818 2574 scope.go:117] "RemoveContainer" containerID="1502e39921155e432202f571e99da78f4c673d7c3b68a9a1c173c781a0c0e463"
Apr 22 19:26:30.455473 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.455445 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5549b6dc-7hqc7"]
Apr 22 19:26:30.455861 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.455844 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90a338b4-011e-41a1-9e5a-eeb91dc3dc21" containerName="registry"
Apr 22 19:26:30.455979 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.455864 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a338b4-011e-41a1-9e5a-eeb91dc3dc21" containerName="registry"
Apr 22 19:26:30.455979 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.455931 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="90a338b4-011e-41a1-9e5a-eeb91dc3dc21" containerName="registry"
Apr 22 19:26:30.458601 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.458582 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.461576 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.461553 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 19:26:30.461681 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.461623 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-p4v6h\""
Apr 22 19:26:30.461808 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.461790 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 19:26:30.461872 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.461864 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 19:26:30.462128 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.462017 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6faud3donhk9a\""
Apr 22 19:26:30.462128 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.462058 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 19:26:30.462274 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.462235 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"]
Apr 22 19:26:30.462540 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.462490 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 19:26:30.464667 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.464645 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-74cf6f6fcc-qk8ss"]
Apr 22 19:26:30.471677 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.471640 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5549b6dc-7hqc7"]
Apr 22 19:26:30.541778 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.541751 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42375e8a-9780-410e-882b-f1a6d3d82976-metrics-client-ca\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.542039 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.542011 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8wq7\" (UniqueName: \"kubernetes.io/projected/42375e8a-9780-410e-882b-f1a6d3d82976-kube-api-access-q8wq7\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.542157 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.542099 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.542222 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.542169 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.542278 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.542226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.542454 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.542426 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-grpc-tls\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.542641 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.542617 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-tls\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.542778 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.542657 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.643621 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.643585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8wq7\" (UniqueName: \"kubernetes.io/projected/42375e8a-9780-410e-882b-f1a6d3d82976-kube-api-access-q8wq7\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.643621 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.643632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.643922 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.643653 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.643922 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.643671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.643922 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.643691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-grpc-tls\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.643922 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.643734 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-tls\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.643922 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.643757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.643922 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.643795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42375e8a-9780-410e-882b-f1a6d3d82976-metrics-client-ca\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.644583 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.644553 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42375e8a-9780-410e-882b-f1a6d3d82976-metrics-client-ca\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.646433 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.646401 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.646527 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.646491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-grpc-tls\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.646670 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.646631 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.646789 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.646775 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.646871 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.646851 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.646939 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.646924 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/42375e8a-9780-410e-882b-f1a6d3d82976-secret-thanos-querier-tls\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.652313 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.652291 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8wq7\" (UniqueName: \"kubernetes.io/projected/42375e8a-9780-410e-882b-f1a6d3d82976-kube-api-access-q8wq7\") pod \"thanos-querier-5549b6dc-7hqc7\" (UID: \"42375e8a-9780-410e-882b-f1a6d3d82976\") " pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.768396 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.768310 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:30.887198 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:30.886989 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5549b6dc-7hqc7"]
Apr 22 19:26:30.889413 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:26:30.889385 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42375e8a_9780_410e_882b_f1a6d3d82976.slice/crio-5cc0f678ff9b1ddb47a9ee6218bf441a7fa0d502c61a80b33af4561493810847 WatchSource:0}: Error finding container 5cc0f678ff9b1ddb47a9ee6218bf441a7fa0d502c61a80b33af4561493810847: Status 404 returned error can't find the container with id 5cc0f678ff9b1ddb47a9ee6218bf441a7fa0d502c61a80b33af4561493810847
Apr 22 19:26:31.424021 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:31.423986 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7" event={"ID":"42375e8a-9780-410e-882b-f1a6d3d82976","Type":"ContainerStarted","Data":"5cc0f678ff9b1ddb47a9ee6218bf441a7fa0d502c61a80b33af4561493810847"}
Apr 22 19:26:31.426105 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:31.426075 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sbxm6" event={"ID":"eef75401-5fc2-4705-9f86-365e393d4977","Type":"ContainerStarted","Data":"d5175cd1912f8d82a31d7323287a27bad29af65a17f455685f78565e6f7ffddd"}
Apr 22 19:26:31.426221 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:31.426109 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sbxm6" event={"ID":"eef75401-5fc2-4705-9f86-365e393d4977","Type":"ContainerStarted","Data":"d610d7cbf8edab0444ac6074a7d2837214f054b75023ed9b8a2a2e39ac6d56e7"}
Apr 22 19:26:31.448147 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:31.448100 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-sbxm6" podStartSLOduration=3.8259780389999998 podStartE2EDuration="4.448088087s" podCreationTimestamp="2026-04-22 19:26:27 +0000 UTC" firstStartedPulling="2026-04-22 19:26:29.190011705 +0000 UTC m=+177.879034787" lastFinishedPulling="2026-04-22 19:26:29.812121753 +0000 UTC m=+178.501144835" observedRunningTime="2026-04-22 19:26:31.446079881 +0000 UTC m=+180.135102982" watchObservedRunningTime="2026-04-22 19:26:31.448088087 +0000 UTC m=+180.137111190"
Apr 22 19:26:31.865403 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:31.865368 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a338b4-011e-41a1-9e5a-eeb91dc3dc21" path="/var/lib/kubelet/pods/90a338b4-011e-41a1-9e5a-eeb91dc3dc21/volumes"
Apr 22 19:26:32.132009 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:32.131921 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz"]
Apr 22 19:26:32.134705 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:32.134684 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz"
Apr 22 19:26:32.139365 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:32.139341 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-vvlxs\""
Apr 22 19:26:32.139959 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:32.139940 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 22 19:26:32.160739 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:32.160691 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz"]
Apr 22 19:26:32.255474 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:32.255439 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bd0d49f1-7eb1-47d3-adae-17b99f35fdec-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xn2fz\" (UID: \"bd0d49f1-7eb1-47d3-adae-17b99f35fdec\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz"
Apr 22 19:26:32.356732 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:32.356685 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bd0d49f1-7eb1-47d3-adae-17b99f35fdec-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xn2fz\" (UID: \"bd0d49f1-7eb1-47d3-adae-17b99f35fdec\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz"
Apr 22 19:26:32.359497 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:32.359469 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bd0d49f1-7eb1-47d3-adae-17b99f35fdec-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xn2fz\" (UID: \"bd0d49f1-7eb1-47d3-adae-17b99f35fdec\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz"
Apr 22 19:26:32.442926 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:32.442896 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz"
Apr 22 19:26:33.030655 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:33.030506 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz"]
Apr 22 19:26:33.033876 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:26:33.033844 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd0d49f1_7eb1_47d3_adae_17b99f35fdec.slice/crio-dddba17ef4b73560b5dc37689c6bc076ce27d686ef5be77043848e9c952500a2 WatchSource:0}: Error finding container dddba17ef4b73560b5dc37689c6bc076ce27d686ef5be77043848e9c952500a2: Status 404 returned error can't find the container with id dddba17ef4b73560b5dc37689c6bc076ce27d686ef5be77043848e9c952500a2
Apr 22 19:26:33.436404 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:33.436369 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7" event={"ID":"42375e8a-9780-410e-882b-f1a6d3d82976","Type":"ContainerStarted","Data":"b6bf6e6ffc02026db684c754168d9ebadec10d5766ae52fab61372ca77fb999d"}
Apr 22 19:26:33.436574 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:33.436410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7" event={"ID":"42375e8a-9780-410e-882b-f1a6d3d82976","Type":"ContainerStarted","Data":"596cb85aaa45aa23475a49e0f10bf5d6fd8307a83f78dfdbb11bb0b8c3c07c0b"}
Apr 22 19:26:33.436574 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:33.436424 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7" event={"ID":"42375e8a-9780-410e-882b-f1a6d3d82976","Type":"ContainerStarted","Data":"c6c3467c908dc8ed5d396522fd50bcf236784a8f38697a4b3d55743ee71530b3"}
Apr 22 19:26:33.439764 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:33.439732 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz" event={"ID":"bd0d49f1-7eb1-47d3-adae-17b99f35fdec","Type":"ContainerStarted","Data":"dddba17ef4b73560b5dc37689c6bc076ce27d686ef5be77043848e9c952500a2"}
Apr 22 19:26:34.445300 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:34.445265 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7" event={"ID":"42375e8a-9780-410e-882b-f1a6d3d82976","Type":"ContainerStarted","Data":"4a6e59ae3263f414afd8bddda09bd335d054ca3785618d987232fac0667dc424"}
Apr 22 19:26:34.445681 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:34.445307 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7" event={"ID":"42375e8a-9780-410e-882b-f1a6d3d82976","Type":"ContainerStarted","Data":"a0686e35d62a8b8d43b7c2f6465e181ce6419d88789d8c0f895e2dc67c99022d"}
Apr 22 19:26:34.445681 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:34.445321 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7" event={"ID":"42375e8a-9780-410e-882b-f1a6d3d82976","Type":"ContainerStarted","Data":"712dd3b4a1f268dfb8fccfd08fd938b7b55add6b17c439bf5cd7a568ed83a1a6"}
Apr 22 19:26:34.445681 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:34.445432 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7"
Apr 22 19:26:34.446482 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:34.446463 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz"
event={"ID":"bd0d49f1-7eb1-47d3-adae-17b99f35fdec","Type":"ContainerStarted","Data":"5bfd07fcc533fa14dc77e8f5b19f337f7145985b955d301bbb2d2e0e20253fef"} Apr 22 19:26:34.446631 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:34.446614 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz" Apr 22 19:26:34.451074 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:34.451056 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz" Apr 22 19:26:34.472888 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:34.472836 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7" podStartSLOduration=1.575558166 podStartE2EDuration="4.472825896s" podCreationTimestamp="2026-04-22 19:26:30 +0000 UTC" firstStartedPulling="2026-04-22 19:26:30.891269988 +0000 UTC m=+179.580293081" lastFinishedPulling="2026-04-22 19:26:33.788537718 +0000 UTC m=+182.477560811" observedRunningTime="2026-04-22 19:26:34.471449441 +0000 UTC m=+183.160472577" watchObservedRunningTime="2026-04-22 19:26:34.472825896 +0000 UTC m=+183.161848996" Apr 22 19:26:34.486857 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:34.486814 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xn2fz" podStartSLOduration=1.14922706 podStartE2EDuration="2.486805188s" podCreationTimestamp="2026-04-22 19:26:32 +0000 UTC" firstStartedPulling="2026-04-22 19:26:33.036070716 +0000 UTC m=+181.725093796" lastFinishedPulling="2026-04-22 19:26:34.373648842 +0000 UTC m=+183.062671924" observedRunningTime="2026-04-22 19:26:34.486347348 +0000 UTC m=+183.175370449" watchObservedRunningTime="2026-04-22 19:26:34.486805188 +0000 UTC m=+183.175828289" Apr 22 19:26:40.459077 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:40.459052 2574 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5549b6dc-7hqc7" Apr 22 19:26:44.206939 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:44.206902 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8699647d9f-kf74x"] Apr 22 19:26:45.478512 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:45.478475 2574 generic.go:358] "Generic (PLEG): container finished" podID="f6ad02c8-88c6-424e-8f61-266e366dcbba" containerID="47ec169b48d3c7bc174425eb16fca1951b208c0c704795d40075a93b5765964f" exitCode=0 Apr 22 19:26:45.478512 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:45.478516 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" event={"ID":"f6ad02c8-88c6-424e-8f61-266e366dcbba","Type":"ContainerDied","Data":"47ec169b48d3c7bc174425eb16fca1951b208c0c704795d40075a93b5765964f"} Apr 22 19:26:45.478960 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:45.478835 2574 scope.go:117] "RemoveContainer" containerID="47ec169b48d3c7bc174425eb16fca1951b208c0c704795d40075a93b5765964f" Apr 22 19:26:46.482871 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:26:46.482832 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-5zk6f" event={"ID":"f6ad02c8-88c6-424e-8f61-266e366dcbba","Type":"ContainerStarted","Data":"7b8b685a18d27cb60baa508add24a6a25c30fa469d0e4fa384ae8904cbf675e6"} Apr 22 19:27:00.523279 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:00.523243 2574 generic.go:358] "Generic (PLEG): container finished" podID="38aa17f7-ede3-4506-a743-892303b3d9b7" containerID="22adb9701ee424d8556054f8f79040dc46e44a4db148a6ad58b18cac83676d78" exitCode=0 Apr 22 19:27:00.523902 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:00.523323 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dnrn8" 
event={"ID":"38aa17f7-ede3-4506-a743-892303b3d9b7","Type":"ContainerDied","Data":"22adb9701ee424d8556054f8f79040dc46e44a4db148a6ad58b18cac83676d78"} Apr 22 19:27:00.523902 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:00.523764 2574 scope.go:117] "RemoveContainer" containerID="22adb9701ee424d8556054f8f79040dc46e44a4db148a6ad58b18cac83676d78" Apr 22 19:27:01.016735 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:01.016688 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rbr82_d9acd301-c698-4dda-b95a-e48e9dfbf761/dns/0.log" Apr 22 19:27:01.022765 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:01.022740 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rbr82_d9acd301-c698-4dda-b95a-e48e9dfbf761/kube-rbac-proxy/0.log" Apr 22 19:27:01.173063 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:01.173035 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cqg2m_3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2/dns-node-resolver/0.log" Apr 22 19:27:01.527997 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:01.527965 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dnrn8" event={"ID":"38aa17f7-ede3-4506-a743-892303b3d9b7","Type":"ContainerStarted","Data":"0ebcb23b05f0a8e582607ddce24f62b1cd8fc3eb027a839a4ece8f48c7f58a5d"} Apr 22 19:27:09.230035 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.229967 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8699647d9f-kf74x" podUID="e393a8cd-0a87-4609-a470-1f8a4549d27b" containerName="console" containerID="cri-o://977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f" gracePeriod=15 Apr 22 19:27:09.481070 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.481017 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-8699647d9f-kf74x_e393a8cd-0a87-4609-a470-1f8a4549d27b/console/0.log" Apr 22 19:27:09.481172 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.481080 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8699647d9f-kf74x" Apr 22 19:27:09.556365 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.556340 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8699647d9f-kf74x_e393a8cd-0a87-4609-a470-1f8a4549d27b/console/0.log" Apr 22 19:27:09.556528 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.556381 2574 generic.go:358] "Generic (PLEG): container finished" podID="e393a8cd-0a87-4609-a470-1f8a4549d27b" containerID="977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f" exitCode=2 Apr 22 19:27:09.556528 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.556444 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8699647d9f-kf74x" Apr 22 19:27:09.556528 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.556453 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8699647d9f-kf74x" event={"ID":"e393a8cd-0a87-4609-a470-1f8a4549d27b","Type":"ContainerDied","Data":"977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f"} Apr 22 19:27:09.556528 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.556485 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8699647d9f-kf74x" event={"ID":"e393a8cd-0a87-4609-a470-1f8a4549d27b","Type":"ContainerDied","Data":"c9e82285f70ac3cd4583d3602f761afc15c42b852a45fc0365d7a9d4cafa32c9"} Apr 22 19:27:09.556528 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.556500 2574 scope.go:117] "RemoveContainer" containerID="977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f" Apr 22 19:27:09.563873 ip-10-0-135-144 kubenswrapper[2574]: I0422 
19:27:09.563855 2574 scope.go:117] "RemoveContainer" containerID="977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f" Apr 22 19:27:09.564134 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:27:09.564114 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f\": container with ID starting with 977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f not found: ID does not exist" containerID="977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f" Apr 22 19:27:09.564197 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.564146 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f"} err="failed to get container status \"977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f\": rpc error: code = NotFound desc = could not find container \"977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f\": container with ID starting with 977751b48058739c52d35685ff4161b81597d85e45c7bc2e2364ad791055be0f not found: ID does not exist" Apr 22 19:27:09.565319 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.565303 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nwzn\" (UniqueName: \"kubernetes.io/projected/e393a8cd-0a87-4609-a470-1f8a4549d27b-kube-api-access-9nwzn\") pod \"e393a8cd-0a87-4609-a470-1f8a4549d27b\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " Apr 22 19:27:09.565360 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.565336 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-oauth-config\") pod \"e393a8cd-0a87-4609-a470-1f8a4549d27b\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " 
Apr 22 19:27:09.565400 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.565362 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-oauth-serving-cert\") pod \"e393a8cd-0a87-4609-a470-1f8a4549d27b\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " Apr 22 19:27:09.565400 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.565380 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-service-ca\") pod \"e393a8cd-0a87-4609-a470-1f8a4549d27b\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " Apr 22 19:27:09.565467 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.565424 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-trusted-ca-bundle\") pod \"e393a8cd-0a87-4609-a470-1f8a4549d27b\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " Apr 22 19:27:09.565467 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.565450 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-config\") pod \"e393a8cd-0a87-4609-a470-1f8a4549d27b\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " Apr 22 19:27:09.565557 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.565490 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-serving-cert\") pod \"e393a8cd-0a87-4609-a470-1f8a4549d27b\" (UID: \"e393a8cd-0a87-4609-a470-1f8a4549d27b\") " Apr 22 19:27:09.565956 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.565928 2574 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e393a8cd-0a87-4609-a470-1f8a4549d27b" (UID: "e393a8cd-0a87-4609-a470-1f8a4549d27b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:09.565956 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.565945 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-config" (OuterVolumeSpecName: "console-config") pod "e393a8cd-0a87-4609-a470-1f8a4549d27b" (UID: "e393a8cd-0a87-4609-a470-1f8a4549d27b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:09.566122 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.565983 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e393a8cd-0a87-4609-a470-1f8a4549d27b" (UID: "e393a8cd-0a87-4609-a470-1f8a4549d27b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:09.566122 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.566034 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-service-ca" (OuterVolumeSpecName: "service-ca") pod "e393a8cd-0a87-4609-a470-1f8a4549d27b" (UID: "e393a8cd-0a87-4609-a470-1f8a4549d27b"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:09.567599 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.567570 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e393a8cd-0a87-4609-a470-1f8a4549d27b" (UID: "e393a8cd-0a87-4609-a470-1f8a4549d27b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:09.567768 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.567611 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e393a8cd-0a87-4609-a470-1f8a4549d27b" (UID: "e393a8cd-0a87-4609-a470-1f8a4549d27b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:09.567768 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.567621 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e393a8cd-0a87-4609-a470-1f8a4549d27b-kube-api-access-9nwzn" (OuterVolumeSpecName: "kube-api-access-9nwzn") pod "e393a8cd-0a87-4609-a470-1f8a4549d27b" (UID: "e393a8cd-0a87-4609-a470-1f8a4549d27b"). InnerVolumeSpecName "kube-api-access-9nwzn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:27:09.666406 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.666371 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-trusted-ca-bundle\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:27:09.666406 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.666398 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-config\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:27:09.666406 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.666409 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-serving-cert\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:27:09.666625 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.666419 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9nwzn\" (UniqueName: \"kubernetes.io/projected/e393a8cd-0a87-4609-a470-1f8a4549d27b-kube-api-access-9nwzn\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:27:09.666625 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.666430 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e393a8cd-0a87-4609-a470-1f8a4549d27b-console-oauth-config\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:27:09.666625 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.666439 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-oauth-serving-cert\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:27:09.666625 
ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.666448 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e393a8cd-0a87-4609-a470-1f8a4549d27b-service-ca\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:27:09.877192 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.877160 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8699647d9f-kf74x"] Apr 22 19:27:09.880613 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:09.880593 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8699647d9f-kf74x"] Apr 22 19:27:10.414311 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:10.414269 2574 patch_prober.go:28] interesting pod/console-8699647d9f-kf74x container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.14:8443/health\": context deadline exceeded" start-of-body= Apr 22 19:27:10.414685 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:10.414355 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-8699647d9f-kf74x" podUID="e393a8cd-0a87-4609-a470-1f8a4549d27b" containerName="console" probeResult="failure" output="Get \"https://10.134.0.14:8443/health\": context deadline exceeded" Apr 22 19:27:11.865814 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:11.865783 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e393a8cd-0a87-4609-a470-1f8a4549d27b" path="/var/lib/kubelet/pods/e393a8cd-0a87-4609-a470-1f8a4549d27b/volumes" Apr 22 19:27:43.739712 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:43.739618 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " 
pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:27:43.741961 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:43.741932 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3f7c0766-21b4-4016-9a86-f022651a4b2e-metrics-certs\") pod \"network-metrics-daemon-rwxr2\" (UID: \"3f7c0766-21b4-4016-9a86-f022651a4b2e\") " pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:27:43.766478 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:43.766452 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ff584\"" Apr 22 19:27:43.774104 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:43.774079 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwxr2" Apr 22 19:27:43.890019 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:43.889986 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rwxr2"] Apr 22 19:27:43.892810 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:27:43.892777 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f7c0766_21b4_4016_9a86_f022651a4b2e.slice/crio-554c5ede0c2dd119d4ff024070595542b7fa8d786d645221d53824095439e213 WatchSource:0}: Error finding container 554c5ede0c2dd119d4ff024070595542b7fa8d786d645221d53824095439e213: Status 404 returned error can't find the container with id 554c5ede0c2dd119d4ff024070595542b7fa8d786d645221d53824095439e213 Apr 22 19:27:44.651678 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:44.651634 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rwxr2" event={"ID":"3f7c0766-21b4-4016-9a86-f022651a4b2e","Type":"ContainerStarted","Data":"554c5ede0c2dd119d4ff024070595542b7fa8d786d645221d53824095439e213"} Apr 22 19:27:45.656226 
ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:45.656187 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rwxr2" event={"ID":"3f7c0766-21b4-4016-9a86-f022651a4b2e","Type":"ContainerStarted","Data":"005589bd8c3603625e01037e5360bccb0f655f32c3dae4be13f3400de6f7c0ac"} Apr 22 19:27:45.656605 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:45.656232 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rwxr2" event={"ID":"3f7c0766-21b4-4016-9a86-f022651a4b2e","Type":"ContainerStarted","Data":"45db3547b0d6d2fb6f1634352fb818705104cd95d4e505925be3bdb25ca9d0f2"} Apr 22 19:27:45.673336 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:45.673281 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rwxr2" podStartSLOduration=252.802523672 podStartE2EDuration="4m13.673263045s" podCreationTimestamp="2026-04-22 19:23:32 +0000 UTC" firstStartedPulling="2026-04-22 19:27:43.894507844 +0000 UTC m=+252.583530923" lastFinishedPulling="2026-04-22 19:27:44.765247218 +0000 UTC m=+253.454270296" observedRunningTime="2026-04-22 19:27:45.673014559 +0000 UTC m=+254.362037683" watchObservedRunningTime="2026-04-22 19:27:45.673263045 +0000 UTC m=+254.362286149" Apr 22 19:27:55.071388 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.071355 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74859bf598-g5ppg"] Apr 22 19:27:55.071774 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.071631 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e393a8cd-0a87-4609-a470-1f8a4549d27b" containerName="console" Apr 22 19:27:55.071774 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.071644 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e393a8cd-0a87-4609-a470-1f8a4549d27b" containerName="console" Apr 22 19:27:55.071774 ip-10-0-135-144 kubenswrapper[2574]: I0422 
19:27:55.071756 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e393a8cd-0a87-4609-a470-1f8a4549d27b" containerName="console" Apr 22 19:27:55.075678 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.075663 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.078532 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.078512 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 19:27:55.078640 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.078512 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 19:27:55.078868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.078851 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-brsqm\"" Apr 22 19:27:55.079838 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.079822 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 19:27:55.079910 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.079859 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 19:27:55.079910 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.079864 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 19:27:55.080069 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.080051 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 19:27:55.080132 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.080086 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 19:27:55.085938 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.085915 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 19:27:55.087095 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.087069 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74859bf598-g5ppg"] Apr 22 19:27:55.226451 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.226408 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-service-ca\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.226658 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.226514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h59rt\" (UniqueName: \"kubernetes.io/projected/d49c43f3-128a-41c8-9bf2-74e8719087e2-kube-api-access-h59rt\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.226658 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.226577 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-serving-cert\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.226658 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.226611 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-oauth-config\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.226658 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.226634 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-oauth-serving-cert\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.226658 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.226650 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-config\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.226964 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.226675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-trusted-ca-bundle\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.327531 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.327436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h59rt\" (UniqueName: \"kubernetes.io/projected/d49c43f3-128a-41c8-9bf2-74e8719087e2-kube-api-access-h59rt\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.327531 ip-10-0-135-144 kubenswrapper[2574]: I0422 
19:27:55.327490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-serving-cert\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.327531 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.327512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-oauth-config\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.327531 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.327532 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-oauth-serving-cert\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.327891 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.327553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-config\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.327891 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.327575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-trusted-ca-bundle\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " 
pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.327891 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.327597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-service-ca\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.328381 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.328358 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-oauth-serving-cert\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.328381 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.328371 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-service-ca\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.328503 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.328358 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-config\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.328606 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.328582 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-trusted-ca-bundle\") pod \"console-74859bf598-g5ppg\" (UID: 
\"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.330003 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.329984 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-oauth-config\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.330162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.330146 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-serving-cert\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.336530 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.336507 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h59rt\" (UniqueName: \"kubernetes.io/projected/d49c43f3-128a-41c8-9bf2-74e8719087e2-kube-api-access-h59rt\") pod \"console-74859bf598-g5ppg\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.384417 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.384382 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:27:55.501608 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.501584 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74859bf598-g5ppg"] Apr 22 19:27:55.504058 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:27:55.504018 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd49c43f3_128a_41c8_9bf2_74e8719087e2.slice/crio-9144821aa7e2dd31a6e6b15594e87c602407992f73deead743f00b7b1e90276b WatchSource:0}: Error finding container 9144821aa7e2dd31a6e6b15594e87c602407992f73deead743f00b7b1e90276b: Status 404 returned error can't find the container with id 9144821aa7e2dd31a6e6b15594e87c602407992f73deead743f00b7b1e90276b Apr 22 19:27:55.684583 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.684549 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74859bf598-g5ppg" event={"ID":"d49c43f3-128a-41c8-9bf2-74e8719087e2","Type":"ContainerStarted","Data":"91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a"} Apr 22 19:27:55.684583 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.684584 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74859bf598-g5ppg" event={"ID":"d49c43f3-128a-41c8-9bf2-74e8719087e2","Type":"ContainerStarted","Data":"9144821aa7e2dd31a6e6b15594e87c602407992f73deead743f00b7b1e90276b"} Apr 22 19:27:55.704359 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:27:55.704305 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74859bf598-g5ppg" podStartSLOduration=0.704289726 podStartE2EDuration="704.289726ms" podCreationTimestamp="2026-04-22 19:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:27:55.703434933 +0000 UTC 
m=+264.392458045" watchObservedRunningTime="2026-04-22 19:27:55.704289726 +0000 UTC m=+264.393312827" Apr 22 19:28:05.384928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:05.384887 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:28:05.385391 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:05.384966 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:28:05.389737 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:05.389695 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:28:05.717199 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:05.717172 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:28:05.837465 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:05.837435 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5476c9ff8c-qzl5x"] Apr 22 19:28:05.840424 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:05.840410 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:05.852188 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:05.852163 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5476c9ff8c-qzl5x"] Apr 22 19:28:06.006610 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.006522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-oauth-config\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.006610 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.006562 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-oauth-serving-cert\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.006610 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.006582 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-trusted-ca-bundle\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.006610 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.006601 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-config\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 
19:28:06.006948 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.006666 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-service-ca\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.006948 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.006747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-serving-cert\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.006948 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.006806 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntn77\" (UniqueName: \"kubernetes.io/projected/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-kube-api-access-ntn77\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.107930 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.107887 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntn77\" (UniqueName: \"kubernetes.io/projected/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-kube-api-access-ntn77\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.108099 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.107943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-oauth-config\") pod 
\"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.108099 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.107967 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-oauth-serving-cert\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.108099 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.107984 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-trusted-ca-bundle\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.108099 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.108000 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-config\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.108099 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.108029 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-service-ca\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.108099 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.108062 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-serving-cert\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.108766 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.108736 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-oauth-serving-cert\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.108890 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.108805 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-config\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.108890 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.108812 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-service-ca\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.109076 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.108893 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-trusted-ca-bundle\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.110455 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.110428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-serving-cert\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.110567 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.110521 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-oauth-config\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.115892 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.115874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntn77\" (UniqueName: \"kubernetes.io/projected/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-kube-api-access-ntn77\") pod \"console-5476c9ff8c-qzl5x\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.149752 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.149703 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:06.272506 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.272477 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5476c9ff8c-qzl5x"] Apr 22 19:28:06.274252 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:28:06.274224 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode59c8c87_b15e_48dc_bb43_7509a2f1ad5d.slice/crio-cd702c2fae177ca4615c635ef3ac70144227865d9ac04792b655a7408d59f7f6 WatchSource:0}: Error finding container cd702c2fae177ca4615c635ef3ac70144227865d9ac04792b655a7408d59f7f6: Status 404 returned error can't find the container with id cd702c2fae177ca4615c635ef3ac70144227865d9ac04792b655a7408d59f7f6 Apr 22 19:28:06.717029 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.716993 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5476c9ff8c-qzl5x" event={"ID":"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d","Type":"ContainerStarted","Data":"1926ae5117df85fe948b4910f18247ba54c301d42b7168f2361790511678df0c"} Apr 22 19:28:06.717029 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.717029 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5476c9ff8c-qzl5x" event={"ID":"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d","Type":"ContainerStarted","Data":"cd702c2fae177ca4615c635ef3ac70144227865d9ac04792b655a7408d59f7f6"} Apr 22 19:28:06.737103 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:06.737050 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5476c9ff8c-qzl5x" podStartSLOduration=1.73703586 podStartE2EDuration="1.73703586s" podCreationTimestamp="2026-04-22 19:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:28:06.735690633 +0000 UTC m=+275.424713778" 
watchObservedRunningTime="2026-04-22 19:28:06.73703586 +0000 UTC m=+275.426058962" Apr 22 19:28:16.150275 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:16.150234 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:16.150799 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:16.150320 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:16.154915 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:16.154895 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:16.748050 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:16.748026 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:28:16.796857 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:16.796831 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74859bf598-g5ppg"] Apr 22 19:28:31.744952 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:31.744907 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:28:31.745434 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:31.745137 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:28:31.748936 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:31.748906 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 19:28:41.815898 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:41.815839 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74859bf598-g5ppg" podUID="d49c43f3-128a-41c8-9bf2-74e8719087e2" 
containerName="console" containerID="cri-o://91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a" gracePeriod=15 Apr 22 19:28:42.044917 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.044893 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74859bf598-g5ppg_d49c43f3-128a-41c8-9bf2-74e8719087e2/console/0.log" Apr 22 19:28:42.045052 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.044978 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:28:42.068343 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.068276 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-service-ca\") pod \"d49c43f3-128a-41c8-9bf2-74e8719087e2\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " Apr 22 19:28:42.068343 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.068342 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h59rt\" (UniqueName: \"kubernetes.io/projected/d49c43f3-128a-41c8-9bf2-74e8719087e2-kube-api-access-h59rt\") pod \"d49c43f3-128a-41c8-9bf2-74e8719087e2\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " Apr 22 19:28:42.068551 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.068373 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-oauth-config\") pod \"d49c43f3-128a-41c8-9bf2-74e8719087e2\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " Apr 22 19:28:42.068551 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.068409 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-oauth-serving-cert\") 
pod \"d49c43f3-128a-41c8-9bf2-74e8719087e2\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " Apr 22 19:28:42.068551 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.068431 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-config\") pod \"d49c43f3-128a-41c8-9bf2-74e8719087e2\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " Apr 22 19:28:42.068551 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.068455 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-trusted-ca-bundle\") pod \"d49c43f3-128a-41c8-9bf2-74e8719087e2\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " Apr 22 19:28:42.068551 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.068504 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-serving-cert\") pod \"d49c43f3-128a-41c8-9bf2-74e8719087e2\" (UID: \"d49c43f3-128a-41c8-9bf2-74e8719087e2\") " Apr 22 19:28:42.068809 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.068700 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-service-ca" (OuterVolumeSpecName: "service-ca") pod "d49c43f3-128a-41c8-9bf2-74e8719087e2" (UID: "d49c43f3-128a-41c8-9bf2-74e8719087e2"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:42.069223 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.069153 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-config" (OuterVolumeSpecName: "console-config") pod "d49c43f3-128a-41c8-9bf2-74e8719087e2" (UID: "d49c43f3-128a-41c8-9bf2-74e8719087e2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:42.069223 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.069189 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d49c43f3-128a-41c8-9bf2-74e8719087e2" (UID: "d49c43f3-128a-41c8-9bf2-74e8719087e2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:42.069223 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.069160 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d49c43f3-128a-41c8-9bf2-74e8719087e2" (UID: "d49c43f3-128a-41c8-9bf2-74e8719087e2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:42.071039 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.071017 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d49c43f3-128a-41c8-9bf2-74e8719087e2" (UID: "d49c43f3-128a-41c8-9bf2-74e8719087e2"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:42.071220 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.071202 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d49c43f3-128a-41c8-9bf2-74e8719087e2" (UID: "d49c43f3-128a-41c8-9bf2-74e8719087e2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:42.071307 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.071287 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d49c43f3-128a-41c8-9bf2-74e8719087e2-kube-api-access-h59rt" (OuterVolumeSpecName: "kube-api-access-h59rt") pod "d49c43f3-128a-41c8-9bf2-74e8719087e2" (UID: "d49c43f3-128a-41c8-9bf2-74e8719087e2"). InnerVolumeSpecName "kube-api-access-h59rt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:28:42.169927 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.169893 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-oauth-config\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:28:42.169927 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.169921 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-oauth-serving-cert\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:28:42.169927 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.169932 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-config\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:28:42.170161 ip-10-0-135-144 
kubenswrapper[2574]: I0422 19:28:42.169941 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-trusted-ca-bundle\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:28:42.170161 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.169950 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d49c43f3-128a-41c8-9bf2-74e8719087e2-console-serving-cert\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:28:42.170161 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.169960 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d49c43f3-128a-41c8-9bf2-74e8719087e2-service-ca\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:28:42.170161 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.169968 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h59rt\" (UniqueName: \"kubernetes.io/projected/d49c43f3-128a-41c8-9bf2-74e8719087e2-kube-api-access-h59rt\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:28:42.820484 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.820455 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74859bf598-g5ppg_d49c43f3-128a-41c8-9bf2-74e8719087e2/console/0.log" Apr 22 19:28:42.820914 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.820496 2574 generic.go:358] "Generic (PLEG): container finished" podID="d49c43f3-128a-41c8-9bf2-74e8719087e2" containerID="91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a" exitCode=2 Apr 22 19:28:42.820914 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.820530 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74859bf598-g5ppg" 
event={"ID":"d49c43f3-128a-41c8-9bf2-74e8719087e2","Type":"ContainerDied","Data":"91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a"} Apr 22 19:28:42.820914 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.820574 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74859bf598-g5ppg" Apr 22 19:28:42.820914 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.820590 2574 scope.go:117] "RemoveContainer" containerID="91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a" Apr 22 19:28:42.820914 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.820577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74859bf598-g5ppg" event={"ID":"d49c43f3-128a-41c8-9bf2-74e8719087e2","Type":"ContainerDied","Data":"9144821aa7e2dd31a6e6b15594e87c602407992f73deead743f00b7b1e90276b"} Apr 22 19:28:42.828935 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.828910 2574 scope.go:117] "RemoveContainer" containerID="91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a" Apr 22 19:28:42.829207 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:28:42.829183 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a\": container with ID starting with 91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a not found: ID does not exist" containerID="91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a" Apr 22 19:28:42.829260 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.829219 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a"} err="failed to get container status \"91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a\": rpc error: code = NotFound desc = could not find container 
\"91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a\": container with ID starting with 91b0dbdd55fcc288a12a6f709928b06f4507ad96fda43fa17b54990ecf6d8f6a not found: ID does not exist" Apr 22 19:28:42.842962 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.842937 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74859bf598-g5ppg"] Apr 22 19:28:42.847443 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:42.847426 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74859bf598-g5ppg"] Apr 22 19:28:43.864241 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:28:43.864209 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d49c43f3-128a-41c8-9bf2-74e8719087e2" path="/var/lib/kubelet/pods/d49c43f3-128a-41c8-9bf2-74e8719087e2/volumes" Apr 22 19:33:02.240397 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.240363 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-tcvq8"] Apr 22 19:33:02.240895 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.240624 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d49c43f3-128a-41c8-9bf2-74e8719087e2" containerName="console" Apr 22 19:33:02.240895 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.240634 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d49c43f3-128a-41c8-9bf2-74e8719087e2" containerName="console" Apr 22 19:33:02.240895 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.240690 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d49c43f3-128a-41c8-9bf2-74e8719087e2" containerName="console" Apr 22 19:33:02.243327 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.243312 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tcvq8" Apr 22 19:33:02.245993 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.245971 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:33:02.253352 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.253330 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tcvq8"] Apr 22 19:33:02.300579 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.300546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a6e2698c-b298-4ba6-bcf9-e1280547830a-kubelet-config\") pod \"global-pull-secret-syncer-tcvq8\" (UID: \"a6e2698c-b298-4ba6-bcf9-e1280547830a\") " pod="kube-system/global-pull-secret-syncer-tcvq8" Apr 22 19:33:02.300579 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.300587 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a6e2698c-b298-4ba6-bcf9-e1280547830a-dbus\") pod \"global-pull-secret-syncer-tcvq8\" (UID: \"a6e2698c-b298-4ba6-bcf9-e1280547830a\") " pod="kube-system/global-pull-secret-syncer-tcvq8" Apr 22 19:33:02.300846 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.300693 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a6e2698c-b298-4ba6-bcf9-e1280547830a-original-pull-secret\") pod \"global-pull-secret-syncer-tcvq8\" (UID: \"a6e2698c-b298-4ba6-bcf9-e1280547830a\") " pod="kube-system/global-pull-secret-syncer-tcvq8" Apr 22 19:33:02.401431 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.401399 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/a6e2698c-b298-4ba6-bcf9-e1280547830a-original-pull-secret\") pod \"global-pull-secret-syncer-tcvq8\" (UID: \"a6e2698c-b298-4ba6-bcf9-e1280547830a\") " pod="kube-system/global-pull-secret-syncer-tcvq8" Apr 22 19:33:02.401564 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.401447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a6e2698c-b298-4ba6-bcf9-e1280547830a-kubelet-config\") pod \"global-pull-secret-syncer-tcvq8\" (UID: \"a6e2698c-b298-4ba6-bcf9-e1280547830a\") " pod="kube-system/global-pull-secret-syncer-tcvq8" Apr 22 19:33:02.401564 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.401474 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a6e2698c-b298-4ba6-bcf9-e1280547830a-dbus\") pod \"global-pull-secret-syncer-tcvq8\" (UID: \"a6e2698c-b298-4ba6-bcf9-e1280547830a\") " pod="kube-system/global-pull-secret-syncer-tcvq8" Apr 22 19:33:02.401564 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.401542 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a6e2698c-b298-4ba6-bcf9-e1280547830a-kubelet-config\") pod \"global-pull-secret-syncer-tcvq8\" (UID: \"a6e2698c-b298-4ba6-bcf9-e1280547830a\") " pod="kube-system/global-pull-secret-syncer-tcvq8" Apr 22 19:33:02.401747 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.401596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a6e2698c-b298-4ba6-bcf9-e1280547830a-dbus\") pod \"global-pull-secret-syncer-tcvq8\" (UID: \"a6e2698c-b298-4ba6-bcf9-e1280547830a\") " pod="kube-system/global-pull-secret-syncer-tcvq8" Apr 22 19:33:02.403766 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.403745 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a6e2698c-b298-4ba6-bcf9-e1280547830a-original-pull-secret\") pod \"global-pull-secret-syncer-tcvq8\" (UID: \"a6e2698c-b298-4ba6-bcf9-e1280547830a\") " pod="kube-system/global-pull-secret-syncer-tcvq8" Apr 22 19:33:02.552695 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.552592 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tcvq8" Apr 22 19:33:02.673060 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.673030 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tcvq8"] Apr 22 19:33:02.675897 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:33:02.675863 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e2698c_b298_4ba6_bcf9_e1280547830a.slice/crio-2ac1940c83e231aafbfcaabb74e88ac08c93ecd7d319981c4dd91e3c7c4bee0c WatchSource:0}: Error finding container 2ac1940c83e231aafbfcaabb74e88ac08c93ecd7d319981c4dd91e3c7c4bee0c: Status 404 returned error can't find the container with id 2ac1940c83e231aafbfcaabb74e88ac08c93ecd7d319981c4dd91e3c7c4bee0c Apr 22 19:33:02.677452 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:02.677435 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:33:03.512938 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:03.512892 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tcvq8" event={"ID":"a6e2698c-b298-4ba6-bcf9-e1280547830a","Type":"ContainerStarted","Data":"2ac1940c83e231aafbfcaabb74e88ac08c93ecd7d319981c4dd91e3c7c4bee0c"} Apr 22 19:33:07.525981 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:07.525943 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tcvq8" 
event={"ID":"a6e2698c-b298-4ba6-bcf9-e1280547830a","Type":"ContainerStarted","Data":"85fad7fe35b82e2eaf45f32685356c13a3b7a3de99f0d4d64ef86b6eb29c5cf4"} Apr 22 19:33:07.559243 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:07.559204 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tcvq8" podStartSLOduration=1.739655753 podStartE2EDuration="5.559189889s" podCreationTimestamp="2026-04-22 19:33:02 +0000 UTC" firstStartedPulling="2026-04-22 19:33:02.677594133 +0000 UTC m=+571.366617212" lastFinishedPulling="2026-04-22 19:33:06.497128268 +0000 UTC m=+575.186151348" observedRunningTime="2026-04-22 19:33:07.55881662 +0000 UTC m=+576.247839732" watchObservedRunningTime="2026-04-22 19:33:07.559189889 +0000 UTC m=+576.248212989" Apr 22 19:33:11.290839 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:11.290806 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5476c9ff8c-qzl5x"] Apr 22 19:33:21.578299 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.578261 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh"] Apr 22 19:33:21.581495 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.581472 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:21.584397 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.584375 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 19:33:21.584502 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.584462 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 19:33:21.585728 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.585697 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 19:33:21.585816 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.585737 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 19:33:21.593067 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.593040 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh"] Apr 22 19:33:21.645120 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.645093 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjnhl\" (UniqueName: \"kubernetes.io/projected/3009086f-4f80-4e29-add4-1cd6e1f8c8a8-kube-api-access-jjnhl\") pod \"klusterlet-addon-workmgr-766876458b-qs4xh\" (UID: \"3009086f-4f80-4e29-add4-1cd6e1f8c8a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:21.645264 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.645130 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/3009086f-4f80-4e29-add4-1cd6e1f8c8a8-tmp\") pod \"klusterlet-addon-workmgr-766876458b-qs4xh\" (UID: \"3009086f-4f80-4e29-add4-1cd6e1f8c8a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:21.645264 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.645150 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3009086f-4f80-4e29-add4-1cd6e1f8c8a8-klusterlet-config\") pod \"klusterlet-addon-workmgr-766876458b-qs4xh\" (UID: \"3009086f-4f80-4e29-add4-1cd6e1f8c8a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:21.746280 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.746248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjnhl\" (UniqueName: \"kubernetes.io/projected/3009086f-4f80-4e29-add4-1cd6e1f8c8a8-kube-api-access-jjnhl\") pod \"klusterlet-addon-workmgr-766876458b-qs4xh\" (UID: \"3009086f-4f80-4e29-add4-1cd6e1f8c8a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:21.746280 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.746284 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3009086f-4f80-4e29-add4-1cd6e1f8c8a8-tmp\") pod \"klusterlet-addon-workmgr-766876458b-qs4xh\" (UID: \"3009086f-4f80-4e29-add4-1cd6e1f8c8a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:21.746468 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.746314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3009086f-4f80-4e29-add4-1cd6e1f8c8a8-klusterlet-config\") pod \"klusterlet-addon-workmgr-766876458b-qs4xh\" (UID: 
\"3009086f-4f80-4e29-add4-1cd6e1f8c8a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:21.746658 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.746637 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3009086f-4f80-4e29-add4-1cd6e1f8c8a8-tmp\") pod \"klusterlet-addon-workmgr-766876458b-qs4xh\" (UID: \"3009086f-4f80-4e29-add4-1cd6e1f8c8a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:21.748755 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.748732 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3009086f-4f80-4e29-add4-1cd6e1f8c8a8-klusterlet-config\") pod \"klusterlet-addon-workmgr-766876458b-qs4xh\" (UID: \"3009086f-4f80-4e29-add4-1cd6e1f8c8a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:21.754957 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.754938 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjnhl\" (UniqueName: \"kubernetes.io/projected/3009086f-4f80-4e29-add4-1cd6e1f8c8a8-kube-api-access-jjnhl\") pod \"klusterlet-addon-workmgr-766876458b-qs4xh\" (UID: \"3009086f-4f80-4e29-add4-1cd6e1f8c8a8\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:21.890115 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.890027 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:21.916102 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.916078 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf"] Apr 22 19:33:21.920524 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.920507 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" Apr 22 19:33:21.923398 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.923376 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 19:33:21.923520 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.923438 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 19:33:21.923574 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.923529 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-s8vnb\"" Apr 22 19:33:21.930033 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.928746 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf"] Apr 22 19:33:21.948126 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.948099 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0411f940-d015-472d-af04-58777a545d26-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf\" (UID: \"0411f940-d015-472d-af04-58777a545d26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" Apr 22 19:33:21.948257 ip-10-0-135-144 
kubenswrapper[2574]: I0422 19:33:21.948134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0411f940-d015-472d-af04-58777a545d26-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf\" (UID: \"0411f940-d015-472d-af04-58777a545d26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" Apr 22 19:33:21.948257 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:21.948156 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rxlh\" (UniqueName: \"kubernetes.io/projected/0411f940-d015-472d-af04-58777a545d26-kube-api-access-7rxlh\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf\" (UID: \"0411f940-d015-472d-af04-58777a545d26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" Apr 22 19:33:22.012779 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:22.012750 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh"] Apr 22 19:33:22.016638 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:33:22.016615 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3009086f_4f80_4e29_add4_1cd6e1f8c8a8.slice/crio-6bd9d14c457c521c918113c747a2cfb4bdf2446268e33235ea7d87b818ab4f2e WatchSource:0}: Error finding container 6bd9d14c457c521c918113c747a2cfb4bdf2446268e33235ea7d87b818ab4f2e: Status 404 returned error can't find the container with id 6bd9d14c457c521c918113c747a2cfb4bdf2446268e33235ea7d87b818ab4f2e Apr 22 19:33:22.049104 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:22.049079 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0411f940-d015-472d-af04-58777a545d26-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf\" (UID: \"0411f940-d015-472d-af04-58777a545d26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" Apr 22 19:33:22.049217 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:22.049111 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0411f940-d015-472d-af04-58777a545d26-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf\" (UID: \"0411f940-d015-472d-af04-58777a545d26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" Apr 22 19:33:22.049217 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:22.049138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rxlh\" (UniqueName: \"kubernetes.io/projected/0411f940-d015-472d-af04-58777a545d26-kube-api-access-7rxlh\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf\" (UID: \"0411f940-d015-472d-af04-58777a545d26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" Apr 22 19:33:22.049480 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:22.049461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0411f940-d015-472d-af04-58777a545d26-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf\" (UID: \"0411f940-d015-472d-af04-58777a545d26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" Apr 22 19:33:22.049540 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:22.049493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0411f940-d015-472d-af04-58777a545d26-bundle\") pod 
\"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf\" (UID: \"0411f940-d015-472d-af04-58777a545d26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" Apr 22 19:33:22.057170 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:22.057146 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rxlh\" (UniqueName: \"kubernetes.io/projected/0411f940-d015-472d-af04-58777a545d26-kube-api-access-7rxlh\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf\" (UID: \"0411f940-d015-472d-af04-58777a545d26\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" Apr 22 19:33:22.232387 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:22.232357 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" Apr 22 19:33:22.346993 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:22.346955 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf"] Apr 22 19:33:22.349698 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:33:22.349666 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0411f940_d015_472d_af04_58777a545d26.slice/crio-e25e47608c35c13cafd654d3ec5ae5961049ffde09450be7f0cbb22a73a78a22 WatchSource:0}: Error finding container e25e47608c35c13cafd654d3ec5ae5961049ffde09450be7f0cbb22a73a78a22: Status 404 returned error can't find the container with id e25e47608c35c13cafd654d3ec5ae5961049ffde09450be7f0cbb22a73a78a22 Apr 22 19:33:22.569007 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:22.568912 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" 
event={"ID":"3009086f-4f80-4e29-add4-1cd6e1f8c8a8","Type":"ContainerStarted","Data":"6bd9d14c457c521c918113c747a2cfb4bdf2446268e33235ea7d87b818ab4f2e"} Apr 22 19:33:22.569844 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:22.569807 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" event={"ID":"0411f940-d015-472d-af04-58777a545d26","Type":"ContainerStarted","Data":"e25e47608c35c13cafd654d3ec5ae5961049ffde09450be7f0cbb22a73a78a22"} Apr 22 19:33:28.588878 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:28.588788 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" event={"ID":"3009086f-4f80-4e29-add4-1cd6e1f8c8a8","Type":"ContainerStarted","Data":"abe0afefd736f1efec7f4775af9638c64bf071be24f4250fb46aafb31623be64"} Apr 22 19:33:28.589324 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:28.589020 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:28.590404 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:28.590374 2574 generic.go:358] "Generic (PLEG): container finished" podID="0411f940-d015-472d-af04-58777a545d26" containerID="e5529d936486b36609b3f438637471d2d56a13b7970716982e0faaa1e20e22e9" exitCode=0 Apr 22 19:33:28.590525 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:28.590399 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" event={"ID":"0411f940-d015-472d-af04-58777a545d26","Type":"ContainerDied","Data":"e5529d936486b36609b3f438637471d2d56a13b7970716982e0faaa1e20e22e9"} Apr 22 19:33:28.590966 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:28.590945 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" Apr 22 19:33:28.605665 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:28.605618 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-766876458b-qs4xh" podStartSLOduration=1.3793337430000001 podStartE2EDuration="7.605605164s" podCreationTimestamp="2026-04-22 19:33:21 +0000 UTC" firstStartedPulling="2026-04-22 19:33:22.018426017 +0000 UTC m=+590.707449097" lastFinishedPulling="2026-04-22 19:33:28.244697439 +0000 UTC m=+596.933720518" observedRunningTime="2026-04-22 19:33:28.604882893 +0000 UTC m=+597.293905994" watchObservedRunningTime="2026-04-22 19:33:28.605605164 +0000 UTC m=+597.294628259" Apr 22 19:33:31.601332 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:31.601301 2574 generic.go:358] "Generic (PLEG): container finished" podID="0411f940-d015-472d-af04-58777a545d26" containerID="2f09e0170b912297703531dd5f6403a5d8ebb03333423259aeea0445842cf91b" exitCode=0 Apr 22 19:33:31.601849 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:31.601386 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" event={"ID":"0411f940-d015-472d-af04-58777a545d26","Type":"ContainerDied","Data":"2f09e0170b912297703531dd5f6403a5d8ebb03333423259aeea0445842cf91b"} Apr 22 19:33:31.767703 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:31.767681 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:33:31.767875 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:31.767681 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:33:36.310034 ip-10-0-135-144 kubenswrapper[2574]: I0422 
19:33:36.309973 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5476c9ff8c-qzl5x" podUID="e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" containerName="console" containerID="cri-o://1926ae5117df85fe948b4910f18247ba54c301d42b7168f2361790511678df0c" gracePeriod=15 Apr 22 19:33:36.744794 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:36.744756 2574 patch_prober.go:28] interesting pod/console-5476c9ff8c-qzl5x container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.19:8443/health\": dial tcp 10.134.0.19:8443: connect: connection refused" start-of-body= Apr 22 19:33:36.744991 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:36.744828 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-5476c9ff8c-qzl5x" podUID="e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" containerName="console" probeResult="failure" output="Get \"https://10.134.0.19:8443/health\": dial tcp 10.134.0.19:8443: connect: connection refused" Apr 22 19:33:37.622446 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.622423 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5476c9ff8c-qzl5x_e59c8c87-b15e-48dc-bb43-7509a2f1ad5d/console/0.log" Apr 22 19:33:37.622791 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.622469 2574 generic.go:358] "Generic (PLEG): container finished" podID="e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" containerID="1926ae5117df85fe948b4910f18247ba54c301d42b7168f2361790511678df0c" exitCode=2 Apr 22 19:33:37.622791 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.622505 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5476c9ff8c-qzl5x" event={"ID":"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d","Type":"ContainerDied","Data":"1926ae5117df85fe948b4910f18247ba54c301d42b7168f2361790511678df0c"} Apr 22 19:33:37.622791 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.622538 2574 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-5476c9ff8c-qzl5x" event={"ID":"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d","Type":"ContainerDied","Data":"cd702c2fae177ca4615c635ef3ac70144227865d9ac04792b655a7408d59f7f6"} Apr 22 19:33:37.622791 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.622549 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd702c2fae177ca4615c635ef3ac70144227865d9ac04792b655a7408d59f7f6" Apr 22 19:33:37.645070 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.645052 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5476c9ff8c-qzl5x_e59c8c87-b15e-48dc-bb43-7509a2f1ad5d/console/0.log" Apr 22 19:33:37.645160 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.645113 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5476c9ff8c-qzl5x" Apr 22 19:33:37.777838 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.777812 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-oauth-config\") pod \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " Apr 22 19:33:37.777962 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.777867 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntn77\" (UniqueName: \"kubernetes.io/projected/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-kube-api-access-ntn77\") pod \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " Apr 22 19:33:37.777962 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.777900 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-config\") pod 
\"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " Apr 22 19:33:37.777962 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.777925 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-trusted-ca-bundle\") pod \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " Apr 22 19:33:37.777962 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.777954 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-serving-cert\") pod \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " Apr 22 19:33:37.778156 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.777988 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-service-ca\") pod \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " Apr 22 19:33:37.778156 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.778019 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-oauth-serving-cert\") pod \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\" (UID: \"e59c8c87-b15e-48dc-bb43-7509a2f1ad5d\") " Apr 22 19:33:37.778374 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.778313 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-config" (OuterVolumeSpecName: "console-config") pod "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" (UID: "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:33:37.778556 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.778448 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-service-ca" (OuterVolumeSpecName: "service-ca") pod "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" (UID: "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:33:37.778556 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.778364 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" (UID: "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:33:37.778556 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.778494 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" (UID: "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:33:37.779898 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.779882 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" (UID: "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d"). InnerVolumeSpecName "console-oauth-config".
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:33:37.780443 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.780428 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" (UID: "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:33:37.780485 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.780463 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-kube-api-access-ntn77" (OuterVolumeSpecName: "kube-api-access-ntn77") pod "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" (UID: "e59c8c87-b15e-48dc-bb43-7509a2f1ad5d"). InnerVolumeSpecName "kube-api-access-ntn77". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:33:37.879537 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.879499 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-config\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:33:37.879537 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.879535 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-trusted-ca-bundle\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:33:37.879537 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.879545 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-serving-cert\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:33:37.879714 ip-10-0-135-144
kubenswrapper[2574]: I0422 19:33:37.879554 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-service-ca\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:33:37.879714 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.879563 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-oauth-serving-cert\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:33:37.879714 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.879571 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-console-oauth-config\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:33:37.879714 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:37.879579 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ntn77\" (UniqueName: \"kubernetes.io/projected/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d-kube-api-access-ntn77\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:33:38.627275 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:38.627240 2574 generic.go:358] "Generic (PLEG): container finished" podID="0411f940-d015-472d-af04-58777a545d26" containerID="f4361ebae841bd92aa851d1bb34b0d40ceceb3a3babad9f0b35903528e579b5c" exitCode=0
Apr 22 19:33:38.627635 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:38.627323 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" event={"ID":"0411f940-d015-472d-af04-58777a545d26","Type":"ContainerDied","Data":"f4361ebae841bd92aa851d1bb34b0d40ceceb3a3babad9f0b35903528e579b5c"}
Apr 22 19:33:38.627635 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:38.627392 2574 util.go:48] "No ready
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5476c9ff8c-qzl5x"
Apr 22 19:33:38.660628 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:38.660600 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5476c9ff8c-qzl5x"]
Apr 22 19:33:38.666088 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:38.666064 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5476c9ff8c-qzl5x"]
Apr 22 19:33:39.747071 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:39.747051 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf"
Apr 22 19:33:39.865045 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:39.865013 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" path="/var/lib/kubelet/pods/e59c8c87-b15e-48dc-bb43-7509a2f1ad5d/volumes"
Apr 22 19:33:39.895074 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:39.895003 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0411f940-d015-472d-af04-58777a545d26-bundle\") pod \"0411f940-d015-472d-af04-58777a545d26\" (UID: \"0411f940-d015-472d-af04-58777a545d26\") "
Apr 22 19:33:39.895074 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:39.895072 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rxlh\" (UniqueName: \"kubernetes.io/projected/0411f940-d015-472d-af04-58777a545d26-kube-api-access-7rxlh\") pod \"0411f940-d015-472d-af04-58777a545d26\" (UID: \"0411f940-d015-472d-af04-58777a545d26\") "
Apr 22 19:33:39.895248 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:39.895112 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName:
\"kubernetes.io/empty-dir/0411f940-d015-472d-af04-58777a545d26-util\") pod \"0411f940-d015-472d-af04-58777a545d26\" (UID: \"0411f940-d015-472d-af04-58777a545d26\") "
Apr 22 19:33:39.895595 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:39.895558 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0411f940-d015-472d-af04-58777a545d26-bundle" (OuterVolumeSpecName: "bundle") pod "0411f940-d015-472d-af04-58777a545d26" (UID: "0411f940-d015-472d-af04-58777a545d26"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:33:39.897285 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:39.897255 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0411f940-d015-472d-af04-58777a545d26-kube-api-access-7rxlh" (OuterVolumeSpecName: "kube-api-access-7rxlh") pod "0411f940-d015-472d-af04-58777a545d26" (UID: "0411f940-d015-472d-af04-58777a545d26"). InnerVolumeSpecName "kube-api-access-7rxlh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:33:39.899980 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:39.899956 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0411f940-d015-472d-af04-58777a545d26-util" (OuterVolumeSpecName: "util") pod "0411f940-d015-472d-af04-58777a545d26" (UID: "0411f940-d015-472d-af04-58777a545d26"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:33:39.996419 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:39.996388 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0411f940-d015-472d-af04-58777a545d26-util\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:33:39.996419 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:39.996415 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0411f940-d015-472d-af04-58777a545d26-bundle\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:33:39.996419 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:39.996424 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7rxlh\" (UniqueName: \"kubernetes.io/projected/0411f940-d015-472d-af04-58777a545d26-kube-api-access-7rxlh\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\""
Apr 22 19:33:40.639563 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:40.639489 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf" event={"ID":"0411f940-d015-472d-af04-58777a545d26","Type":"ContainerDied","Data":"e25e47608c35c13cafd654d3ec5ae5961049ffde09450be7f0cbb22a73a78a22"}
Apr 22 19:33:40.639563 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:40.639511 2574 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8zjlf"
Apr 22 19:33:40.639563 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:40.639526 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e25e47608c35c13cafd654d3ec5ae5961049ffde09450be7f0cbb22a73a78a22"
Apr 22 19:33:43.906304 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.906275 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w"]
Apr 22 19:33:43.907596 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.906537 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" containerName="console"
Apr 22 19:33:43.907596 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.906547 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" containerName="console"
Apr 22 19:33:43.907596 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.906562 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0411f940-d015-472d-af04-58777a545d26" containerName="extract"
Apr 22 19:33:43.907596 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.906568 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0411f940-d015-472d-af04-58777a545d26" containerName="extract"
Apr 22 19:33:43.907596 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.906574 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0411f940-d015-472d-af04-58777a545d26" containerName="pull"
Apr 22 19:33:43.907596 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.906579 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0411f940-d015-472d-af04-58777a545d26" containerName="pull"
Apr 22 19:33:43.907596 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.906586 2574 cpu_manager.go:401] "RemoveStaleState: containerMap:
removing container" podUID="0411f940-d015-472d-af04-58777a545d26" containerName="util"
Apr 22 19:33:43.907596 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.906591 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0411f940-d015-472d-af04-58777a545d26" containerName="util"
Apr 22 19:33:43.907596 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.906636 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0411f940-d015-472d-af04-58777a545d26" containerName="extract"
Apr 22 19:33:43.907596 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.906645 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e59c8c87-b15e-48dc-bb43-7509a2f1ad5d" containerName="console"
Apr 22 19:33:43.908161 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.908145 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w"
Apr 22 19:33:43.911015 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.910991 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 22 19:33:43.911131 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.911001 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-g6vbt\""
Apr 22 19:33:43.911333 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.911320 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 22 19:33:43.911596 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.911583 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 22 19:33:43.921233 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:43.921212 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w"]
Apr 22 19:33:44.026489 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:44.026441 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm566\" (UniqueName: \"kubernetes.io/projected/920b3da4-2b68-46d4-b30d-8e2b5557edc0-kube-api-access-lm566\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5h72w\" (UID: \"920b3da4-2b68-46d4-b30d-8e2b5557edc0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w"
Apr 22 19:33:44.026671 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:44.026556 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/920b3da4-2b68-46d4-b30d-8e2b5557edc0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5h72w\" (UID: \"920b3da4-2b68-46d4-b30d-8e2b5557edc0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w"
Apr 22 19:33:44.127616 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:44.127565 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm566\" (UniqueName: \"kubernetes.io/projected/920b3da4-2b68-46d4-b30d-8e2b5557edc0-kube-api-access-lm566\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5h72w\" (UID: \"920b3da4-2b68-46d4-b30d-8e2b5557edc0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w"
Apr 22 19:33:44.127798 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:44.127643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/920b3da4-2b68-46d4-b30d-8e2b5557edc0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5h72w\" (UID: \"920b3da4-2b68-46d4-b30d-8e2b5557edc0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w"
Apr 22 19:33:44.129970 ip-10-0-135-144
kubenswrapper[2574]: I0422 19:33:44.129939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/920b3da4-2b68-46d4-b30d-8e2b5557edc0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5h72w\" (UID: \"920b3da4-2b68-46d4-b30d-8e2b5557edc0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w"
Apr 22 19:33:44.136590 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:44.136565 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm566\" (UniqueName: \"kubernetes.io/projected/920b3da4-2b68-46d4-b30d-8e2b5557edc0-kube-api-access-lm566\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5h72w\" (UID: \"920b3da4-2b68-46d4-b30d-8e2b5557edc0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w"
Apr 22 19:33:44.218357 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:44.218278 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w"
Apr 22 19:33:44.337162 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:44.337140 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w"]
Apr 22 19:33:44.339326 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:33:44.339297 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod920b3da4_2b68_46d4_b30d_8e2b5557edc0.slice/crio-7ea079d7309c59eda1cfa88fabbbb380a7140d3fad222d1bc8787838f8697c42 WatchSource:0}: Error finding container 7ea079d7309c59eda1cfa88fabbbb380a7140d3fad222d1bc8787838f8697c42: Status 404 returned error can't find the container with id 7ea079d7309c59eda1cfa88fabbbb380a7140d3fad222d1bc8787838f8697c42
Apr 22 19:33:44.654780 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:44.654748 2574 kubelet.go:2569] "SyncLoop (PLEG): event for
pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w" event={"ID":"920b3da4-2b68-46d4-b30d-8e2b5557edc0","Type":"ContainerStarted","Data":"7ea079d7309c59eda1cfa88fabbbb380a7140d3fad222d1bc8787838f8697c42"}
Apr 22 19:33:48.495892 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.495856 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-24mhg"]
Apr 22 19:33:48.499331 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.499308 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-24mhg"
Apr 22 19:33:48.502046 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.502016 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 19:33:48.502046 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.502017 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 22 19:33:48.502212 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.502049 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-8dlvs\""
Apr 22 19:33:48.508051 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.508028 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-24mhg"]
Apr 22 19:33:48.665466 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.665421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/446c66c0-b6ae-464a-915e-d2d21c85f77a-cabundle0\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg"
Apr 22 19:33:48.665658 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.665501 2574 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg"
Apr 22 19:33:48.665658 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.665548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ps8p\" (UniqueName: \"kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-kube-api-access-8ps8p\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg"
Apr 22 19:33:48.669798 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.669769 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w" event={"ID":"920b3da4-2b68-46d4-b30d-8e2b5557edc0","Type":"ContainerStarted","Data":"1dc6bea3f6413832f4c96b445cb761d9f8f62364791f1fb3398f758e7b4d1bee"}
Apr 22 19:33:48.669955 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.669861 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w"
Apr 22 19:33:48.696262 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.696205 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w" podStartSLOduration=2.119786023 podStartE2EDuration="5.696188428s" podCreationTimestamp="2026-04-22 19:33:43 +0000 UTC" firstStartedPulling="2026-04-22 19:33:44.340989292 +0000 UTC m=+613.030012372" lastFinishedPulling="2026-04-22 19:33:47.91739169 +0000 UTC m=+616.606414777" observedRunningTime="2026-04-22 19:33:48.695767219 +0000 UTC m=+617.384790321" watchObservedRunningTime="2026-04-22 19:33:48.696188428
+0000 UTC m=+617.385211532"
Apr 22 19:33:48.766662 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.766584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ps8p\" (UniqueName: \"kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-kube-api-access-8ps8p\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg"
Apr 22 19:33:48.766662 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.766653 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/446c66c0-b6ae-464a-915e-d2d21c85f77a-cabundle0\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg"
Apr 22 19:33:48.766928 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.766908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg"
Apr 22 19:33:48.767015 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:48.767001 2574 secret.go:281] references non-existent secret key: ca.crt
Apr 22 19:33:48.767070 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:48.767020 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 19:33:48.767070 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:48.767034 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-24mhg: references non-existent secret key: ca.crt
Apr 22 19:33:48.767147 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:48.767109 2574 nestedpendingoperations.go:348]
Operation for "{volumeName:kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates podName:446c66c0-b6ae-464a-915e-d2d21c85f77a nodeName:}" failed. No retries permitted until 2026-04-22 19:33:49.267087179 +0000 UTC m=+617.956110263 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates") pod "keda-operator-ffbb595cb-24mhg" (UID: "446c66c0-b6ae-464a-915e-d2d21c85f77a") : references non-existent secret key: ca.crt
Apr 22 19:33:48.767356 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.767337 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/446c66c0-b6ae-464a-915e-d2d21c85f77a-cabundle0\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg"
Apr 22 19:33:48.777151 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.777128 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc"]
Apr 22 19:33:48.780246 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.780232 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc"
Apr 22 19:33:48.783298 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.783277 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 22 19:33:48.785892 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.785876 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ps8p\" (UniqueName: \"kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-kube-api-access-8ps8p\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg"
Apr 22 19:33:48.790196 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.790174 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc"]
Apr 22 19:33:48.868100 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.868070 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v4ks\" (UniqueName: \"kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-kube-api-access-9v4ks\") pod \"keda-metrics-apiserver-7c9f485588-lsfcc\" (UID: \"dc9d02db-d451-420d-87f6-09680f4f2436\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc"
Apr 22 19:33:48.868243 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.868120 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/dc9d02db-d451-420d-87f6-09680f4f2436-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-lsfcc\" (UID: \"dc9d02db-d451-420d-87f6-09680f4f2436\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc"
Apr 22 19:33:48.868286 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.868249 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lsfcc\" (UID: \"dc9d02db-d451-420d-87f6-09680f4f2436\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc"
Apr 22 19:33:48.968817 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.968780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lsfcc\" (UID: \"dc9d02db-d451-420d-87f6-09680f4f2436\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc"
Apr 22 19:33:48.969010 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.968830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v4ks\" (UniqueName: \"kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-kube-api-access-9v4ks\") pod \"keda-metrics-apiserver-7c9f485588-lsfcc\" (UID: \"dc9d02db-d451-420d-87f6-09680f4f2436\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc"
Apr 22 19:33:48.969010 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.968875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/dc9d02db-d451-420d-87f6-09680f4f2436-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-lsfcc\" (UID: \"dc9d02db-d451-420d-87f6-09680f4f2436\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc"
Apr 22 19:33:48.969010 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:48.968939 2574 secret.go:281] references non-existent secret key: tls.crt
Apr 22 19:33:48.969010 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:48.968961 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 19:33:48.969010 ip-10-0-135-144 kubenswrapper[2574]:
E0422 19:33:48.968979 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc: references non-existent secret key: tls.crt
Apr 22 19:33:48.969169 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:48.969045 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-certificates podName:dc9d02db-d451-420d-87f6-09680f4f2436 nodeName:}" failed. No retries permitted until 2026-04-22 19:33:49.469030959 +0000 UTC m=+618.158054038 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-certificates") pod "keda-metrics-apiserver-7c9f485588-lsfcc" (UID: "dc9d02db-d451-420d-87f6-09680f4f2436") : references non-existent secret key: tls.crt
Apr 22 19:33:48.969312 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.969296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/dc9d02db-d451-420d-87f6-09680f4f2436-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-lsfcc\" (UID: \"dc9d02db-d451-420d-87f6-09680f4f2436\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc"
Apr 22 19:33:48.979241 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:48.979208 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v4ks\" (UniqueName: \"kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-kube-api-access-9v4ks\") pod \"keda-metrics-apiserver-7c9f485588-lsfcc\" (UID: \"dc9d02db-d451-420d-87f6-09680f4f2436\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc"
Apr 22 19:33:49.090578 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:49.090498 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-wlhfw"]
Apr 22 19:33:49.093876 ip-10-0-135-144 kubenswrapper[2574]:
I0422 19:33:49.093859 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-wlhfw"
Apr 22 19:33:49.097150 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:49.097130 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 22 19:33:49.104936 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:49.104909 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-wlhfw"]
Apr 22 19:33:49.170998 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:49.170972 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c464f856-90b6-4765-86f0-1ba832214fb4-certificates\") pod \"keda-admission-cf49989db-wlhfw\" (UID: \"c464f856-90b6-4765-86f0-1ba832214fb4\") " pod="openshift-keda/keda-admission-cf49989db-wlhfw"
Apr 22 19:33:49.171117 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:49.171036 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nttz4\" (UniqueName: \"kubernetes.io/projected/c464f856-90b6-4765-86f0-1ba832214fb4-kube-api-access-nttz4\") pod \"keda-admission-cf49989db-wlhfw\" (UID: \"c464f856-90b6-4765-86f0-1ba832214fb4\") " pod="openshift-keda/keda-admission-cf49989db-wlhfw"
Apr 22 19:33:49.272074 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:49.272042 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c464f856-90b6-4765-86f0-1ba832214fb4-certificates\") pod \"keda-admission-cf49989db-wlhfw\" (UID: \"c464f856-90b6-4765-86f0-1ba832214fb4\") " pod="openshift-keda/keda-admission-cf49989db-wlhfw"
Apr 22 19:33:49.272198 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:49.272090 2574 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg" Apr 22 19:33:49.272198 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:49.272125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nttz4\" (UniqueName: \"kubernetes.io/projected/c464f856-90b6-4765-86f0-1ba832214fb4-kube-api-access-nttz4\") pod \"keda-admission-cf49989db-wlhfw\" (UID: \"c464f856-90b6-4765-86f0-1ba832214fb4\") " pod="openshift-keda/keda-admission-cf49989db-wlhfw" Apr 22 19:33:49.272270 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:49.272184 2574 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 22 19:33:49.272270 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:49.272219 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-wlhfw: secret "keda-admission-webhooks-certs" not found Apr 22 19:33:49.272270 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:49.272246 2574 secret.go:281] references non-existent secret key: ca.crt Apr 22 19:33:49.272270 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:49.272262 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 19:33:49.272270 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:49.272272 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-24mhg: references non-existent secret key: ca.crt Apr 22 19:33:49.272442 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:49.272276 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/c464f856-90b6-4765-86f0-1ba832214fb4-certificates podName:c464f856-90b6-4765-86f0-1ba832214fb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:33:49.772260449 +0000 UTC m=+618.461283527 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c464f856-90b6-4765-86f0-1ba832214fb4-certificates") pod "keda-admission-cf49989db-wlhfw" (UID: "c464f856-90b6-4765-86f0-1ba832214fb4") : secret "keda-admission-webhooks-certs" not found Apr 22 19:33:49.272442 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:49.272316 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates podName:446c66c0-b6ae-464a-915e-d2d21c85f77a nodeName:}" failed. No retries permitted until 2026-04-22 19:33:50.272300699 +0000 UTC m=+618.961323783 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates") pod "keda-operator-ffbb595cb-24mhg" (UID: "446c66c0-b6ae-464a-915e-d2d21c85f77a") : references non-existent secret key: ca.crt Apr 22 19:33:49.281148 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:49.281122 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nttz4\" (UniqueName: \"kubernetes.io/projected/c464f856-90b6-4765-86f0-1ba832214fb4-kube-api-access-nttz4\") pod \"keda-admission-cf49989db-wlhfw\" (UID: \"c464f856-90b6-4765-86f0-1ba832214fb4\") " pod="openshift-keda/keda-admission-cf49989db-wlhfw" Apr 22 19:33:49.473586 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:49.473552 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lsfcc\" (UID: 
\"dc9d02db-d451-420d-87f6-09680f4f2436\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc" Apr 22 19:33:49.473772 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:49.473689 2574 secret.go:281] references non-existent secret key: tls.crt Apr 22 19:33:49.473772 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:49.473710 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 19:33:49.473772 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:49.473742 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc: references non-existent secret key: tls.crt Apr 22 19:33:49.473892 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:49.473801 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-certificates podName:dc9d02db-d451-420d-87f6-09680f4f2436 nodeName:}" failed. No retries permitted until 2026-04-22 19:33:50.473786571 +0000 UTC m=+619.162809650 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-certificates") pod "keda-metrics-apiserver-7c9f485588-lsfcc" (UID: "dc9d02db-d451-420d-87f6-09680f4f2436") : references non-existent secret key: tls.crt Apr 22 19:33:49.776220 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:49.776130 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c464f856-90b6-4765-86f0-1ba832214fb4-certificates\") pod \"keda-admission-cf49989db-wlhfw\" (UID: \"c464f856-90b6-4765-86f0-1ba832214fb4\") " pod="openshift-keda/keda-admission-cf49989db-wlhfw" Apr 22 19:33:49.778569 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:49.778535 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c464f856-90b6-4765-86f0-1ba832214fb4-certificates\") pod \"keda-admission-cf49989db-wlhfw\" (UID: \"c464f856-90b6-4765-86f0-1ba832214fb4\") " pod="openshift-keda/keda-admission-cf49989db-wlhfw" Apr 22 19:33:50.006858 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:50.006823 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-wlhfw" Apr 22 19:33:50.121495 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:50.121470 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-wlhfw"] Apr 22 19:33:50.123871 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:33:50.123847 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc464f856_90b6_4765_86f0_1ba832214fb4.slice/crio-965f9c1af5b87dac034cbdde4c508bb5c618f8da4fb50d5cc6815650f805b389 WatchSource:0}: Error finding container 965f9c1af5b87dac034cbdde4c508bb5c618f8da4fb50d5cc6815650f805b389: Status 404 returned error can't find the container with id 965f9c1af5b87dac034cbdde4c508bb5c618f8da4fb50d5cc6815650f805b389 Apr 22 19:33:50.280477 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:50.280446 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg" Apr 22 19:33:50.280627 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:50.280598 2574 secret.go:281] references non-existent secret key: ca.crt Apr 22 19:33:50.280627 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:50.280615 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 19:33:50.280627 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:50.280624 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-24mhg: references non-existent secret key: ca.crt Apr 22 19:33:50.280784 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:50.280674 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates podName:446c66c0-b6ae-464a-915e-d2d21c85f77a nodeName:}" failed. No retries permitted until 2026-04-22 19:33:52.280659316 +0000 UTC m=+620.969682400 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates") pod "keda-operator-ffbb595cb-24mhg" (UID: "446c66c0-b6ae-464a-915e-d2d21c85f77a") : references non-existent secret key: ca.crt Apr 22 19:33:50.481880 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:50.481851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lsfcc\" (UID: \"dc9d02db-d451-420d-87f6-09680f4f2436\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc" Apr 22 19:33:50.482051 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:50.481985 2574 secret.go:281] references non-existent secret key: tls.crt Apr 22 19:33:50.482051 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:50.482000 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 19:33:50.482051 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:50.482018 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc: references non-existent secret key: tls.crt Apr 22 19:33:50.482151 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:50.482065 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-certificates podName:dc9d02db-d451-420d-87f6-09680f4f2436 nodeName:}" failed. No retries permitted until 2026-04-22 19:33:52.482051519 +0000 UTC m=+621.171074598 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-certificates") pod "keda-metrics-apiserver-7c9f485588-lsfcc" (UID: "dc9d02db-d451-420d-87f6-09680f4f2436") : references non-existent secret key: tls.crt Apr 22 19:33:50.676896 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:50.676857 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-wlhfw" event={"ID":"c464f856-90b6-4765-86f0-1ba832214fb4","Type":"ContainerStarted","Data":"965f9c1af5b87dac034cbdde4c508bb5c618f8da4fb50d5cc6815650f805b389"} Apr 22 19:33:52.299490 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:52.299390 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg" Apr 22 19:33:52.299898 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:52.299572 2574 secret.go:281] references non-existent secret key: ca.crt Apr 22 19:33:52.299898 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:52.299597 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 19:33:52.299898 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:52.299610 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-24mhg: references non-existent secret key: ca.crt Apr 22 19:33:52.299898 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:33:52.299676 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates podName:446c66c0-b6ae-464a-915e-d2d21c85f77a nodeName:}" failed. 
No retries permitted until 2026-04-22 19:33:56.299656016 +0000 UTC m=+624.988679102 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates") pod "keda-operator-ffbb595cb-24mhg" (UID: "446c66c0-b6ae-464a-915e-d2d21c85f77a") : references non-existent secret key: ca.crt Apr 22 19:33:52.501179 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:52.501129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lsfcc\" (UID: \"dc9d02db-d451-420d-87f6-09680f4f2436\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc" Apr 22 19:33:52.503690 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:52.503656 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dc9d02db-d451-420d-87f6-09680f4f2436-certificates\") pod \"keda-metrics-apiserver-7c9f485588-lsfcc\" (UID: \"dc9d02db-d451-420d-87f6-09680f4f2436\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc" Apr 22 19:33:52.684706 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:52.684672 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-wlhfw" event={"ID":"c464f856-90b6-4765-86f0-1ba832214fb4","Type":"ContainerStarted","Data":"1c510feffcfba95b36fd332db7961b03e0ac3b45e03e86d0a01aa0de5d7833eb"} Apr 22 19:33:52.684875 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:52.684765 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-wlhfw" Apr 22 19:33:52.698852 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:52.698828 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc" Apr 22 19:33:52.703117 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:52.703075 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-wlhfw" podStartSLOduration=1.78495656 podStartE2EDuration="3.703060675s" podCreationTimestamp="2026-04-22 19:33:49 +0000 UTC" firstStartedPulling="2026-04-22 19:33:50.12556595 +0000 UTC m=+618.814589036" lastFinishedPulling="2026-04-22 19:33:52.043670072 +0000 UTC m=+620.732693151" observedRunningTime="2026-04-22 19:33:52.701207886 +0000 UTC m=+621.390230987" watchObservedRunningTime="2026-04-22 19:33:52.703060675 +0000 UTC m=+621.392083777" Apr 22 19:33:52.817653 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:52.817581 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc"] Apr 22 19:33:52.820334 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:33:52.820303 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc9d02db_d451_420d_87f6_09680f4f2436.slice/crio-cae5fa4b65011cac04ee2226c4d9a0f055a875cb9e735caa82760a8a92c55fda WatchSource:0}: Error finding container cae5fa4b65011cac04ee2226c4d9a0f055a875cb9e735caa82760a8a92c55fda: Status 404 returned error can't find the container with id cae5fa4b65011cac04ee2226c4d9a0f055a875cb9e735caa82760a8a92c55fda Apr 22 19:33:53.689077 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:53.689044 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc" event={"ID":"dc9d02db-d451-420d-87f6-09680f4f2436","Type":"ContainerStarted","Data":"cae5fa4b65011cac04ee2226c4d9a0f055a875cb9e735caa82760a8a92c55fda"} Apr 22 19:33:56.335875 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:56.335840 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"certificates\" (UniqueName: \"kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg" Apr 22 19:33:56.338239 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:56.338220 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/446c66c0-b6ae-464a-915e-d2d21c85f77a-certificates\") pod \"keda-operator-ffbb595cb-24mhg\" (UID: \"446c66c0-b6ae-464a-915e-d2d21c85f77a\") " pod="openshift-keda/keda-operator-ffbb595cb-24mhg" Apr 22 19:33:56.609743 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:56.609635 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-24mhg" Apr 22 19:33:56.702973 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:56.702942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc" event={"ID":"dc9d02db-d451-420d-87f6-09680f4f2436","Type":"ContainerStarted","Data":"5641f7057cee5ed21e3b9eded194cb5a3408b5cdfc34e8ffa41ae0cfe70a52b3"} Apr 22 19:33:56.703133 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:56.703078 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc" Apr 22 19:33:56.719572 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:56.719528 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc" podStartSLOduration=5.90230043 podStartE2EDuration="8.719513368s" podCreationTimestamp="2026-04-22 19:33:48 +0000 UTC" firstStartedPulling="2026-04-22 19:33:52.821647241 +0000 UTC m=+621.510670320" lastFinishedPulling="2026-04-22 19:33:55.638860178 +0000 UTC m=+624.327883258" observedRunningTime="2026-04-22 19:33:56.719202763 +0000 UTC 
m=+625.408225875" watchObservedRunningTime="2026-04-22 19:33:56.719513368 +0000 UTC m=+625.408536672" Apr 22 19:33:56.724866 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:56.724847 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-24mhg"] Apr 22 19:33:56.726923 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:33:56.726898 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod446c66c0_b6ae_464a_915e_d2d21c85f77a.slice/crio-59604f3c1bc920f26c2cc4edd7f660f8be5c354ccce18a1642d1e1e86328586d WatchSource:0}: Error finding container 59604f3c1bc920f26c2cc4edd7f660f8be5c354ccce18a1642d1e1e86328586d: Status 404 returned error can't find the container with id 59604f3c1bc920f26c2cc4edd7f660f8be5c354ccce18a1642d1e1e86328586d Apr 22 19:33:57.707540 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:33:57.707499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-24mhg" event={"ID":"446c66c0-b6ae-464a-915e-d2d21c85f77a","Type":"ContainerStarted","Data":"59604f3c1bc920f26c2cc4edd7f660f8be5c354ccce18a1642d1e1e86328586d"} Apr 22 19:34:00.718512 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:00.718448 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-24mhg" event={"ID":"446c66c0-b6ae-464a-915e-d2d21c85f77a","Type":"ContainerStarted","Data":"6af642281500dffc6e5f82af3ea7032ea40bfb1a9f95e275173d99707da1ef63"} Apr 22 19:34:00.719036 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:00.718544 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-24mhg" Apr 22 19:34:07.712422 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:07.712391 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-lsfcc" Apr 22 19:34:07.751872 ip-10-0-135-144 
kubenswrapper[2574]: I0422 19:34:07.751816 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-24mhg" podStartSLOduration=16.095480602 podStartE2EDuration="19.751801028s" podCreationTimestamp="2026-04-22 19:33:48 +0000 UTC" firstStartedPulling="2026-04-22 19:33:56.728200402 +0000 UTC m=+625.417223485" lastFinishedPulling="2026-04-22 19:34:00.384520828 +0000 UTC m=+629.073543911" observedRunningTime="2026-04-22 19:34:00.749402842 +0000 UTC m=+629.438425954" watchObservedRunningTime="2026-04-22 19:34:07.751801028 +0000 UTC m=+636.440824132" Apr 22 19:34:09.675178 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:09.675142 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5h72w" Apr 22 19:34:13.691850 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:13.691816 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-wlhfw" Apr 22 19:34:21.725872 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:21.725837 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-24mhg" Apr 22 19:34:31.777401 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:31.777371 2574 scope.go:117] "RemoveContainer" containerID="1926ae5117df85fe948b4910f18247ba54c301d42b7168f2361790511678df0c" Apr 22 19:34:56.088647 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.088604 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-6p62v"] Apr 22 19:34:56.091993 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.091971 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-6p62v" Apr 22 19:34:56.094676 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.094656 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 19:34:56.095119 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.095095 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 19:34:56.095226 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.095161 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-mp48v\"" Apr 22 19:34:56.095336 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.095321 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 19:34:56.102534 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.102514 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-6p62v"] Apr 22 19:34:56.114656 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.114630 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n"] Apr 22 19:34:56.118072 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.118053 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" Apr 22 19:34:56.121639 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.121618 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 19:34:56.121773 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.121646 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-2gnq2\"" Apr 22 19:34:56.128031 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.128012 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n"] Apr 22 19:34:56.133576 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.133552 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-kbxgc"] Apr 22 19:34:56.136425 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.136406 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-kbxgc" Apr 22 19:34:56.139301 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.139281 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-c5d95\"" Apr 22 19:34:56.139384 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.139287 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 19:34:56.145248 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.145225 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-kbxgc"] Apr 22 19:34:56.224293 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.224260 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wz5n\" (UniqueName: \"kubernetes.io/projected/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc-kube-api-access-5wz5n\") pod \"kserve-controller-manager-545d8995fb-6p62v\" (UID: \"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc\") " pod="kserve/kserve-controller-manager-545d8995fb-6p62v" Apr 22 19:34:56.224293 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.224297 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc-cert\") pod \"kserve-controller-manager-545d8995fb-6p62v\" (UID: \"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc\") " pod="kserve/kserve-controller-manager-545d8995fb-6p62v" Apr 22 19:34:56.224514 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.224352 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf162d4e-ec56-421a-ac73-a69be845984a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-kcz2n\" (UID: \"cf162d4e-ec56-421a-ac73-a69be845984a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" Apr 22 19:34:56.224514 
ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.224383 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sjwf\" (UniqueName: \"kubernetes.io/projected/cf162d4e-ec56-421a-ac73-a69be845984a-kube-api-access-4sjwf\") pod \"llmisvc-controller-manager-68cc5db7c4-kcz2n\" (UID: \"cf162d4e-ec56-421a-ac73-a69be845984a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" Apr 22 19:34:56.325854 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.325822 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wz5n\" (UniqueName: \"kubernetes.io/projected/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc-kube-api-access-5wz5n\") pod \"kserve-controller-manager-545d8995fb-6p62v\" (UID: \"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc\") " pod="kserve/kserve-controller-manager-545d8995fb-6p62v" Apr 22 19:34:56.325854 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.325858 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc-cert\") pod \"kserve-controller-manager-545d8995fb-6p62v\" (UID: \"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc\") " pod="kserve/kserve-controller-manager-545d8995fb-6p62v" Apr 22 19:34:56.326106 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.325901 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4ab439ba-7881-4aab-acc5-f8cb88d19978-data\") pod \"seaweedfs-86cc847c5c-kbxgc\" (UID: \"4ab439ba-7881-4aab-acc5-f8cb88d19978\") " pod="kserve/seaweedfs-86cc847c5c-kbxgc" Apr 22 19:34:56.326106 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.325922 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhnnr\" (UniqueName: 
\"kubernetes.io/projected/4ab439ba-7881-4aab-acc5-f8cb88d19978-kube-api-access-zhnnr\") pod \"seaweedfs-86cc847c5c-kbxgc\" (UID: \"4ab439ba-7881-4aab-acc5-f8cb88d19978\") " pod="kserve/seaweedfs-86cc847c5c-kbxgc" Apr 22 19:34:56.326106 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.325940 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf162d4e-ec56-421a-ac73-a69be845984a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-kcz2n\" (UID: \"cf162d4e-ec56-421a-ac73-a69be845984a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" Apr 22 19:34:56.326106 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.326007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sjwf\" (UniqueName: \"kubernetes.io/projected/cf162d4e-ec56-421a-ac73-a69be845984a-kube-api-access-4sjwf\") pod \"llmisvc-controller-manager-68cc5db7c4-kcz2n\" (UID: \"cf162d4e-ec56-421a-ac73-a69be845984a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" Apr 22 19:34:56.326106 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:34:56.326041 2574 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 22 19:34:56.326378 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:34:56.326112 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf162d4e-ec56-421a-ac73-a69be845984a-cert podName:cf162d4e-ec56-421a-ac73-a69be845984a nodeName:}" failed. No retries permitted until 2026-04-22 19:34:56.826089637 +0000 UTC m=+685.515112715 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cf162d4e-ec56-421a-ac73-a69be845984a-cert") pod "llmisvc-controller-manager-68cc5db7c4-kcz2n" (UID: "cf162d4e-ec56-421a-ac73-a69be845984a") : secret "llmisvc-webhook-server-cert" not found Apr 22 19:34:56.328337 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.328314 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc-cert\") pod \"kserve-controller-manager-545d8995fb-6p62v\" (UID: \"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc\") " pod="kserve/kserve-controller-manager-545d8995fb-6p62v" Apr 22 19:34:56.334883 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.334860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wz5n\" (UniqueName: \"kubernetes.io/projected/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc-kube-api-access-5wz5n\") pod \"kserve-controller-manager-545d8995fb-6p62v\" (UID: \"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc\") " pod="kserve/kserve-controller-manager-545d8995fb-6p62v" Apr 22 19:34:56.334996 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.334946 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sjwf\" (UniqueName: \"kubernetes.io/projected/cf162d4e-ec56-421a-ac73-a69be845984a-kube-api-access-4sjwf\") pod \"llmisvc-controller-manager-68cc5db7c4-kcz2n\" (UID: \"cf162d4e-ec56-421a-ac73-a69be845984a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" Apr 22 19:34:56.402862 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.402841 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-6p62v" Apr 22 19:34:56.426868 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.426837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4ab439ba-7881-4aab-acc5-f8cb88d19978-data\") pod \"seaweedfs-86cc847c5c-kbxgc\" (UID: \"4ab439ba-7881-4aab-acc5-f8cb88d19978\") " pod="kserve/seaweedfs-86cc847c5c-kbxgc" Apr 22 19:34:56.427025 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.426876 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhnnr\" (UniqueName: \"kubernetes.io/projected/4ab439ba-7881-4aab-acc5-f8cb88d19978-kube-api-access-zhnnr\") pod \"seaweedfs-86cc847c5c-kbxgc\" (UID: \"4ab439ba-7881-4aab-acc5-f8cb88d19978\") " pod="kserve/seaweedfs-86cc847c5c-kbxgc" Apr 22 19:34:56.427247 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.427225 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4ab439ba-7881-4aab-acc5-f8cb88d19978-data\") pod \"seaweedfs-86cc847c5c-kbxgc\" (UID: \"4ab439ba-7881-4aab-acc5-f8cb88d19978\") " pod="kserve/seaweedfs-86cc847c5c-kbxgc" Apr 22 19:34:56.435969 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.435942 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhnnr\" (UniqueName: \"kubernetes.io/projected/4ab439ba-7881-4aab-acc5-f8cb88d19978-kube-api-access-zhnnr\") pod \"seaweedfs-86cc847c5c-kbxgc\" (UID: \"4ab439ba-7881-4aab-acc5-f8cb88d19978\") " pod="kserve/seaweedfs-86cc847c5c-kbxgc" Apr 22 19:34:56.446790 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.446551 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-kbxgc" Apr 22 19:34:56.529055 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.529021 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-6p62v"] Apr 22 19:34:56.532048 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:34:56.532018 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28fe80d7_7b22_4d94_a5a4_5b4e2c045fdc.slice/crio-f9310bd46dc5da03799e1824048e76a31f6951f00ca9dd7a86293fb0d3cdd9b1 WatchSource:0}: Error finding container f9310bd46dc5da03799e1824048e76a31f6951f00ca9dd7a86293fb0d3cdd9b1: Status 404 returned error can't find the container with id f9310bd46dc5da03799e1824048e76a31f6951f00ca9dd7a86293fb0d3cdd9b1 Apr 22 19:34:56.576060 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.576038 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-kbxgc"] Apr 22 19:34:56.577526 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:34:56.577495 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ab439ba_7881_4aab_acc5_f8cb88d19978.slice/crio-44c8e7a6bd1a7f9f9c4e0e08874bb6a5a9c24c363bd1a176d7193c79accc6fb7 WatchSource:0}: Error finding container 44c8e7a6bd1a7f9f9c4e0e08874bb6a5a9c24c363bd1a176d7193c79accc6fb7: Status 404 returned error can't find the container with id 44c8e7a6bd1a7f9f9c4e0e08874bb6a5a9c24c363bd1a176d7193c79accc6fb7 Apr 22 19:34:56.830734 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.830630 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf162d4e-ec56-421a-ac73-a69be845984a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-kcz2n\" (UID: \"cf162d4e-ec56-421a-ac73-a69be845984a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" Apr 22 19:34:56.832946 
ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.832920 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf162d4e-ec56-421a-ac73-a69be845984a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-kcz2n\" (UID: \"cf162d4e-ec56-421a-ac73-a69be845984a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" Apr 22 19:34:56.903850 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.903812 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-kbxgc" event={"ID":"4ab439ba-7881-4aab-acc5-f8cb88d19978","Type":"ContainerStarted","Data":"44c8e7a6bd1a7f9f9c4e0e08874bb6a5a9c24c363bd1a176d7193c79accc6fb7"} Apr 22 19:34:56.904770 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:56.904743 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-6p62v" event={"ID":"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc","Type":"ContainerStarted","Data":"f9310bd46dc5da03799e1824048e76a31f6951f00ca9dd7a86293fb0d3cdd9b1"} Apr 22 19:34:57.028978 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:57.028943 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" Apr 22 19:34:57.175463 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:57.175432 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n"] Apr 22 19:34:57.177555 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:34:57.177524 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcf162d4e_ec56_421a_ac73_a69be845984a.slice/crio-156b5c1c6e7f862c5aedd3a01d49035c0f191c2e4d2172c2e60f4ac1e4dcf432 WatchSource:0}: Error finding container 156b5c1c6e7f862c5aedd3a01d49035c0f191c2e4d2172c2e60f4ac1e4dcf432: Status 404 returned error can't find the container with id 156b5c1c6e7f862c5aedd3a01d49035c0f191c2e4d2172c2e60f4ac1e4dcf432 Apr 22 19:34:57.910788 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:34:57.910741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" event={"ID":"cf162d4e-ec56-421a-ac73-a69be845984a","Type":"ContainerStarted","Data":"156b5c1c6e7f862c5aedd3a01d49035c0f191c2e4d2172c2e60f4ac1e4dcf432"} Apr 22 19:35:01.926775 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:01.926741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-kbxgc" event={"ID":"4ab439ba-7881-4aab-acc5-f8cb88d19978","Type":"ContainerStarted","Data":"cb7438a26e6efa067cf898be1dfdb3504103459ad7d67a2fa607382f81663ee3"} Apr 22 19:35:01.927283 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:01.926881 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-kbxgc" Apr 22 19:35:01.928239 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:01.928206 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-6p62v" 
event={"ID":"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc","Type":"ContainerStarted","Data":"4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79"} Apr 22 19:35:01.928629 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:01.928607 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-545d8995fb-6p62v" Apr 22 19:35:01.929780 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:01.929757 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" event={"ID":"cf162d4e-ec56-421a-ac73-a69be845984a","Type":"ContainerStarted","Data":"907012726a2f59c4c0ac61b27ea88b1999429d57f30f869a0037ba82819050d3"} Apr 22 19:35:01.929877 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:01.929864 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" Apr 22 19:35:01.947263 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:01.947202 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-kbxgc" podStartSLOduration=1.123283721 podStartE2EDuration="5.947186276s" podCreationTimestamp="2026-04-22 19:34:56 +0000 UTC" firstStartedPulling="2026-04-22 19:34:56.578893582 +0000 UTC m=+685.267916662" lastFinishedPulling="2026-04-22 19:35:01.402796119 +0000 UTC m=+690.091819217" observedRunningTime="2026-04-22 19:35:01.945504041 +0000 UTC m=+690.634527143" watchObservedRunningTime="2026-04-22 19:35:01.947186276 +0000 UTC m=+690.636209377" Apr 22 19:35:01.962966 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:01.962909 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" podStartSLOduration=1.793507014 podStartE2EDuration="5.962893176s" podCreationTimestamp="2026-04-22 19:34:56 +0000 UTC" firstStartedPulling="2026-04-22 19:34:57.179059774 +0000 UTC m=+685.868082858" 
lastFinishedPulling="2026-04-22 19:35:01.348445931 +0000 UTC m=+690.037469020" observedRunningTime="2026-04-22 19:35:01.960980267 +0000 UTC m=+690.650003367" watchObservedRunningTime="2026-04-22 19:35:01.962893176 +0000 UTC m=+690.651916277" Apr 22 19:35:01.978699 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:01.978646 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-545d8995fb-6p62v" podStartSLOduration=2.025560801 podStartE2EDuration="5.978630434s" podCreationTimestamp="2026-04-22 19:34:56 +0000 UTC" firstStartedPulling="2026-04-22 19:34:56.533307059 +0000 UTC m=+685.222330142" lastFinishedPulling="2026-04-22 19:35:00.486376681 +0000 UTC m=+689.175399775" observedRunningTime="2026-04-22 19:35:01.977621293 +0000 UTC m=+690.666644405" watchObservedRunningTime="2026-04-22 19:35:01.978630434 +0000 UTC m=+690.667653535" Apr 22 19:35:07.935805 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:07.935775 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-kbxgc" Apr 22 19:35:32.935553 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:32.935519 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-kcz2n" Apr 22 19:35:33.940954 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:33.940925 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-545d8995fb-6p62v" Apr 22 19:35:34.191133 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.191049 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-6p62v"] Apr 22 19:35:34.191275 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.191240 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-545d8995fb-6p62v" podUID="28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc" containerName="manager" 
containerID="cri-o://4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79" gracePeriod=10 Apr 22 19:35:34.227964 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.227932 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-d6jd2"] Apr 22 19:35:34.280025 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.280001 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-d6jd2"] Apr 22 19:35:34.280137 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.280126 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" Apr 22 19:35:34.435415 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.435382 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c00196bd-d0de-4482-b0ad-6948288211b3-cert\") pod \"kserve-controller-manager-545d8995fb-d6jd2\" (UID: \"c00196bd-d0de-4482-b0ad-6948288211b3\") " pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" Apr 22 19:35:34.435565 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.435419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wh8w\" (UniqueName: \"kubernetes.io/projected/c00196bd-d0de-4482-b0ad-6948288211b3-kube-api-access-4wh8w\") pod \"kserve-controller-manager-545d8995fb-d6jd2\" (UID: \"c00196bd-d0de-4482-b0ad-6948288211b3\") " pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" Apr 22 19:35:34.451899 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.451847 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-6p62v" Apr 22 19:35:34.535941 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.535911 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc-cert\") pod \"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc\" (UID: \"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc\") " Apr 22 19:35:34.536103 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.535996 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wz5n\" (UniqueName: \"kubernetes.io/projected/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc-kube-api-access-5wz5n\") pod \"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc\" (UID: \"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc\") " Apr 22 19:35:34.536155 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.536117 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c00196bd-d0de-4482-b0ad-6948288211b3-cert\") pod \"kserve-controller-manager-545d8995fb-d6jd2\" (UID: \"c00196bd-d0de-4482-b0ad-6948288211b3\") " pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" Apr 22 19:35:34.536155 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.536139 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wh8w\" (UniqueName: \"kubernetes.io/projected/c00196bd-d0de-4482-b0ad-6948288211b3-kube-api-access-4wh8w\") pod \"kserve-controller-manager-545d8995fb-d6jd2\" (UID: \"c00196bd-d0de-4482-b0ad-6948288211b3\") " pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" Apr 22 19:35:34.538169 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.538138 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc-kube-api-access-5wz5n" (OuterVolumeSpecName: "kube-api-access-5wz5n") pod 
"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc" (UID: "28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc"). InnerVolumeSpecName "kube-api-access-5wz5n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:35:34.538169 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.538142 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc-cert" (OuterVolumeSpecName: "cert") pod "28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc" (UID: "28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:35:34.538425 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.538406 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c00196bd-d0de-4482-b0ad-6948288211b3-cert\") pod \"kserve-controller-manager-545d8995fb-d6jd2\" (UID: \"c00196bd-d0de-4482-b0ad-6948288211b3\") " pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" Apr 22 19:35:34.546943 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.546917 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wh8w\" (UniqueName: \"kubernetes.io/projected/c00196bd-d0de-4482-b0ad-6948288211b3-kube-api-access-4wh8w\") pod \"kserve-controller-manager-545d8995fb-d6jd2\" (UID: \"c00196bd-d0de-4482-b0ad-6948288211b3\") " pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" Apr 22 19:35:34.637264 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.637228 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5wz5n\" (UniqueName: \"kubernetes.io/projected/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc-kube-api-access-5wz5n\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:35:34.637264 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.637258 2574 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc-cert\") on node \"ip-10-0-135-144.ec2.internal\" DevicePath \"\"" Apr 22 19:35:34.650076 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.650046 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" Apr 22 19:35:34.769822 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:34.769794 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-d6jd2"] Apr 22 19:35:34.771566 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:35:34.771535 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc00196bd_d0de_4482_b0ad_6948288211b3.slice/crio-3d6c7af885d7c03a84931c75414130acda60d68ad61e4ba246a08eed873fb5c4 WatchSource:0}: Error finding container 3d6c7af885d7c03a84931c75414130acda60d68ad61e4ba246a08eed873fb5c4: Status 404 returned error can't find the container with id 3d6c7af885d7c03a84931c75414130acda60d68ad61e4ba246a08eed873fb5c4 Apr 22 19:35:35.040338 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:35.040254 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" event={"ID":"c00196bd-d0de-4482-b0ad-6948288211b3","Type":"ContainerStarted","Data":"3d6c7af885d7c03a84931c75414130acda60d68ad61e4ba246a08eed873fb5c4"} Apr 22 19:35:35.041353 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:35.041326 2574 generic.go:358] "Generic (PLEG): container finished" podID="28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc" containerID="4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79" exitCode=0 Apr 22 19:35:35.041476 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:35.041386 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-6p62v" 
event={"ID":"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc","Type":"ContainerDied","Data":"4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79"} Apr 22 19:35:35.041476 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:35.041395 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-545d8995fb-6p62v" Apr 22 19:35:35.041476 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:35.041417 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-6p62v" event={"ID":"28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc","Type":"ContainerDied","Data":"f9310bd46dc5da03799e1824048e76a31f6951f00ca9dd7a86293fb0d3cdd9b1"} Apr 22 19:35:35.041476 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:35.041434 2574 scope.go:117] "RemoveContainer" containerID="4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79" Apr 22 19:35:35.049259 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:35.049245 2574 scope.go:117] "RemoveContainer" containerID="4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79" Apr 22 19:35:35.049507 ip-10-0-135-144 kubenswrapper[2574]: E0422 19:35:35.049484 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79\": container with ID starting with 4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79 not found: ID does not exist" containerID="4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79" Apr 22 19:35:35.049548 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:35.049523 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79"} err="failed to get container status \"4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79\": rpc error: code = NotFound desc = could not find 
container \"4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79\": container with ID starting with 4277d132272e496eac4ddef581f7ce176dce587069ff974ca2493d512df8da79 not found: ID does not exist" Apr 22 19:35:35.067700 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:35.067676 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-6p62v"] Apr 22 19:35:35.071785 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:35.071765 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-545d8995fb-6p62v"] Apr 22 19:35:35.865823 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:35.865793 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc" path="/var/lib/kubelet/pods/28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc/volumes" Apr 22 19:35:36.045588 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:36.045551 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" event={"ID":"c00196bd-d0de-4482-b0ad-6948288211b3","Type":"ContainerStarted","Data":"b0b9343208c7e71adadf3df08ec2d6319a02fb759f4f3390d22ba082076cabcd"} Apr 22 19:35:36.046054 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:36.045616 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" Apr 22 19:35:36.081776 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:35:36.081703 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" podStartSLOduration=1.658334504 podStartE2EDuration="2.08168756s" podCreationTimestamp="2026-04-22 19:35:34 +0000 UTC" firstStartedPulling="2026-04-22 19:35:34.772909474 +0000 UTC m=+723.461932554" lastFinishedPulling="2026-04-22 19:35:35.196262529 +0000 UTC m=+723.885285610" observedRunningTime="2026-04-22 19:35:36.080572639 +0000 UTC m=+724.769595739" 
watchObservedRunningTime="2026-04-22 19:35:36.08168756 +0000 UTC m=+724.770710660" Apr 22 19:36:07.055414 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:36:07.055381 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-545d8995fb-d6jd2" Apr 22 19:37:08.951858 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:08.951817 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5"] Apr 22 19:37:08.952422 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:08.952311 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc" containerName="manager" Apr 22 19:37:08.952422 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:08.952330 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc" containerName="manager" Apr 22 19:37:08.952542 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:08.952429 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="28fe80d7-7b22-4d94-a5a4-5b4e2c045fdc" containerName="manager" Apr 22 19:37:08.955401 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:08.955379 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" Apr 22 19:37:08.958013 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:08.957993 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 22 19:37:08.958121 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:08.958028 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 22 19:37:08.961991 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:08.961969 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5"] Apr 22 19:37:09.006244 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:09.006213 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm6kc\" (UniqueName: \"kubernetes.io/projected/084ba333-869f-4315-a656-2e9b1fbf99c7-kube-api-access-zm6kc\") pod \"seaweedfs-tls-custom-5c88b85bb7-b85g5\" (UID: \"084ba333-869f-4315-a656-2e9b1fbf99c7\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" Apr 22 19:37:09.006423 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:09.006285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/084ba333-869f-4315-a656-2e9b1fbf99c7-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-b85g5\" (UID: \"084ba333-869f-4315-a656-2e9b1fbf99c7\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" Apr 22 19:37:09.006423 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:09.006335 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/084ba333-869f-4315-a656-2e9b1fbf99c7-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-b85g5\" (UID: \"084ba333-869f-4315-a656-2e9b1fbf99c7\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" 
Apr 22 19:37:09.106889 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:09.106839 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/084ba333-869f-4315-a656-2e9b1fbf99c7-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-b85g5\" (UID: \"084ba333-869f-4315-a656-2e9b1fbf99c7\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" Apr 22 19:37:09.106889 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:09.106899 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zm6kc\" (UniqueName: \"kubernetes.io/projected/084ba333-869f-4315-a656-2e9b1fbf99c7-kube-api-access-zm6kc\") pod \"seaweedfs-tls-custom-5c88b85bb7-b85g5\" (UID: \"084ba333-869f-4315-a656-2e9b1fbf99c7\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" Apr 22 19:37:09.107152 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:09.106964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/084ba333-869f-4315-a656-2e9b1fbf99c7-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-b85g5\" (UID: \"084ba333-869f-4315-a656-2e9b1fbf99c7\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" Apr 22 19:37:09.107301 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:09.107281 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/084ba333-869f-4315-a656-2e9b1fbf99c7-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-b85g5\" (UID: \"084ba333-869f-4315-a656-2e9b1fbf99c7\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" Apr 22 19:37:09.109417 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:09.109394 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/084ba333-869f-4315-a656-2e9b1fbf99c7-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-b85g5\" 
(UID: \"084ba333-869f-4315-a656-2e9b1fbf99c7\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" Apr 22 19:37:09.114852 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:09.114827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm6kc\" (UniqueName: \"kubernetes.io/projected/084ba333-869f-4315-a656-2e9b1fbf99c7-kube-api-access-zm6kc\") pod \"seaweedfs-tls-custom-5c88b85bb7-b85g5\" (UID: \"084ba333-869f-4315-a656-2e9b1fbf99c7\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" Apr 22 19:37:09.265483 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:09.265400 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" Apr 22 19:37:09.384189 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:09.384158 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5"] Apr 22 19:37:09.386703 ip-10-0-135-144 kubenswrapper[2574]: W0422 19:37:09.386665 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod084ba333_869f_4315_a656_2e9b1fbf99c7.slice/crio-baf098dfcace3e3dbcada8e411a183ad8d43c8a2cc0756aafdb42fdbccd94b11 WatchSource:0}: Error finding container baf098dfcace3e3dbcada8e411a183ad8d43c8a2cc0756aafdb42fdbccd94b11: Status 404 returned error can't find the container with id baf098dfcace3e3dbcada8e411a183ad8d43c8a2cc0756aafdb42fdbccd94b11 Apr 22 19:37:10.349338 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:10.349274 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" event={"ID":"084ba333-869f-4315-a656-2e9b1fbf99c7","Type":"ContainerStarted","Data":"28020bb7ded260d5564ad3e6fad66f66780e65b4ce1ee77c0f33a7f27377bf23"} Apr 22 19:37:10.349338 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:10.349335 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" event={"ID":"084ba333-869f-4315-a656-2e9b1fbf99c7","Type":"ContainerStarted","Data":"baf098dfcace3e3dbcada8e411a183ad8d43c8a2cc0756aafdb42fdbccd94b11"} Apr 22 19:37:10.367214 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:37:10.367166 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-b85g5" podStartSLOduration=2.121840076 podStartE2EDuration="2.367151658s" podCreationTimestamp="2026-04-22 19:37:08 +0000 UTC" firstStartedPulling="2026-04-22 19:37:09.387898895 +0000 UTC m=+818.076921974" lastFinishedPulling="2026-04-22 19:37:09.633210476 +0000 UTC m=+818.322233556" observedRunningTime="2026-04-22 19:37:10.364604199 +0000 UTC m=+819.053627301" watchObservedRunningTime="2026-04-22 19:37:10.367151658 +0000 UTC m=+819.056174759" Apr 22 19:38:31.789375 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:38:31.789350 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:38:31.790681 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:38:31.790661 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:43:31.810946 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:43:31.810919 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:43:31.813608 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:43:31.813588 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:48:31.832704 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:48:31.832671 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:48:31.836414 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:48:31.836390 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:53:31.856219 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:53:31.856191 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:53:31.859985 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:53:31.859966 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:58:31.877460 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:58:31.877433 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 19:58:31.882693 ip-10-0-135-144 kubenswrapper[2574]: I0422 19:58:31.882674 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 20:03:31.904567 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:03:31.904531 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 20:03:31.910815 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:03:31.910794 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log" Apr 22 20:08:31.928420 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:08:31.928344 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log"
Apr 22 20:08:31.933803 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:08:31.933780 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log"
Apr 22 20:13:31.950300 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:13:31.950270 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log"
Apr 22 20:13:31.956304 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:13:31.956277 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log"
Apr 22 20:18:31.971604 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:18:31.971571 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log"
Apr 22 20:18:31.978393 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:18:31.978357 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log"
Apr 22 20:23:31.993148 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:23:31.993116 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log"
Apr 22 20:23:32.000460 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:23:32.000441 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log"
Apr 22 20:25:54.514828 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:54.514796 2574 ???:1] "http: TLS handshake error from 10.0.141.16:60518: EOF"
Apr 22 20:25:54.522136 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:54.522114 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tcvq8_a6e2698c-b298-4ba6-bcf9-e1280547830a/global-pull-secret-syncer/0.log"
Apr 22 20:25:54.584667 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:54.584631 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-27hl5_027e0f5b-b1d5-4ef4-a370-2ba4520f5d94/konnectivity-agent/0.log"
Apr 22 20:25:54.789432 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:54.789360 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-144.ec2.internal_fab854a3125f200d91109b3a7636112a/haproxy/0.log"
Apr 22 20:25:58.512025 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:58.511992 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-xn2fz_bd0d49f1-7eb1-47d3-adae-17b99f35fdec/monitoring-plugin/0.log"
Apr 22 20:25:58.692483 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:58.692457 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sbxm6_eef75401-5fc2-4705-9f86-365e393d4977/node-exporter/0.log"
Apr 22 20:25:58.712170 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:58.712149 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sbxm6_eef75401-5fc2-4705-9f86-365e393d4977/kube-rbac-proxy/0.log"
Apr 22 20:25:58.731603 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:58.731586 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sbxm6_eef75401-5fc2-4705-9f86-365e393d4977/init-textfile/0.log"
Apr 22 20:25:58.756749 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:58.756711 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-4xctw_aaecdab7-4807-4bae-baab-35520e039402/kube-rbac-proxy-main/0.log"
Apr 22 20:25:58.776851 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:58.776756 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-4xctw_aaecdab7-4807-4bae-baab-35520e039402/kube-rbac-proxy-self/0.log"
Apr 22 20:25:58.801528 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:58.801505 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-4xctw_aaecdab7-4807-4bae-baab-35520e039402/openshift-state-metrics/0.log"
Apr 22 20:25:59.182547 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:59.182522 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5549b6dc-7hqc7_42375e8a-9780-410e-882b-f1a6d3d82976/thanos-query/0.log"
Apr 22 20:25:59.205028 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:59.205002 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5549b6dc-7hqc7_42375e8a-9780-410e-882b-f1a6d3d82976/kube-rbac-proxy-web/0.log"
Apr 22 20:25:59.231683 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:59.231657 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5549b6dc-7hqc7_42375e8a-9780-410e-882b-f1a6d3d82976/kube-rbac-proxy/0.log"
Apr 22 20:25:59.264376 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:59.264347 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5549b6dc-7hqc7_42375e8a-9780-410e-882b-f1a6d3d82976/prom-label-proxy/0.log"
Apr 22 20:25:59.302092 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:59.302065 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5549b6dc-7hqc7_42375e8a-9780-410e-882b-f1a6d3d82976/kube-rbac-proxy-rules/0.log"
Apr 22 20:25:59.330343 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:25:59.330323 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5549b6dc-7hqc7_42375e8a-9780-410e-882b-f1a6d3d82976/kube-rbac-proxy-metrics/0.log"
Apr 22 20:26:01.411902 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.411870 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"]
Apr 22 20:26:01.415174 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.415151 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.417498 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.417462 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9c5pk\"/\"default-dockercfg-lz96s\""
Apr 22 20:26:01.418824 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.418803 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9c5pk\"/\"kube-root-ca.crt\""
Apr 22 20:26:01.418932 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.418892 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9c5pk\"/\"openshift-service-ca.crt\""
Apr 22 20:26:01.420620 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.420599 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"]
Apr 22 20:26:01.451480 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.451447 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e19178b-995e-41e6-8601-22f91cb7127c-sys\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.451480 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.451475 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e19178b-995e-41e6-8601-22f91cb7127c-lib-modules\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.451669 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.451502 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e19178b-995e-41e6-8601-22f91cb7127c-podres\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.451669 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.451590 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e19178b-995e-41e6-8601-22f91cb7127c-proc\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.451669 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.451640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5t8q\" (UniqueName: \"kubernetes.io/projected/3e19178b-995e-41e6-8601-22f91cb7127c-kube-api-access-f5t8q\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.552102 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.552060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e19178b-995e-41e6-8601-22f91cb7127c-proc\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.552285 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.552118 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5t8q\" (UniqueName: \"kubernetes.io/projected/3e19178b-995e-41e6-8601-22f91cb7127c-kube-api-access-f5t8q\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.552285 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.552146 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e19178b-995e-41e6-8601-22f91cb7127c-sys\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.552285 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.552161 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e19178b-995e-41e6-8601-22f91cb7127c-lib-modules\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.552285 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.552182 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e19178b-995e-41e6-8601-22f91cb7127c-proc\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.552285 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.552190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e19178b-995e-41e6-8601-22f91cb7127c-podres\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.552285 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.552264 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e19178b-995e-41e6-8601-22f91cb7127c-sys\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.552505 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.552295 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e19178b-995e-41e6-8601-22f91cb7127c-podres\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.552505 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.552313 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e19178b-995e-41e6-8601-22f91cb7127c-lib-modules\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.560898 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.560866 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5t8q\" (UniqueName: \"kubernetes.io/projected/3e19178b-995e-41e6-8601-22f91cb7127c-kube-api-access-f5t8q\") pod \"perf-node-gather-daemonset-fqkqn\" (UID: \"3e19178b-995e-41e6-8601-22f91cb7127c\") " pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.603578 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.603551 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-bf5vp_8a37d76d-bb5f-4997-83f8-5da1496ba0e9/volume-data-source-validator/0.log"
Apr 22 20:26:01.725881 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.725803 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:01.845314 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.845288 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"]
Apr 22 20:26:01.847572 ip-10-0-135-144 kubenswrapper[2574]: W0422 20:26:01.847532 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3e19178b_995e_41e6_8601_22f91cb7127c.slice/crio-2f2c9df4d8664c5708f375b4424af29269704e5b26ee2f14a396f77473aa20ed WatchSource:0}: Error finding container 2f2c9df4d8664c5708f375b4424af29269704e5b26ee2f14a396f77473aa20ed: Status 404 returned error can't find the container with id 2f2c9df4d8664c5708f375b4424af29269704e5b26ee2f14a396f77473aa20ed
Apr 22 20:26:01.849140 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.849123 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:26:01.905770 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.905741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn" event={"ID":"3e19178b-995e-41e6-8601-22f91cb7127c","Type":"ContainerStarted","Data":"bf036940facfb478147c74f84778f5d0f4bea1499b884a2dcf53621ead5440fd"}
Apr 22 20:26:01.905865 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:01.905775 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn" event={"ID":"3e19178b-995e-41e6-8601-22f91cb7127c","Type":"ContainerStarted","Data":"2f2c9df4d8664c5708f375b4424af29269704e5b26ee2f14a396f77473aa20ed"}
Apr 22 20:26:02.353381 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:02.353350 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rbr82_d9acd301-c698-4dda-b95a-e48e9dfbf761/dns/0.log"
Apr 22 20:26:02.372268 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:02.372246 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rbr82_d9acd301-c698-4dda-b95a-e48e9dfbf761/kube-rbac-proxy/0.log"
Apr 22 20:26:02.392945 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:02.392925 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cqg2m_3ab42ccb-aaa7-4fc9-ba6b-0440f5b86ef2/dns-node-resolver/0.log"
Apr 22 20:26:02.834669 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:02.834642 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6f46764956-vbbf4_675eace7-6336-4022-a08d-68006d2cbe80/registry/0.log"
Apr 22 20:26:02.878313 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:02.878278 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-b9dhn_277e1d66-9594-42a8-b953-7fcddeac7dad/node-ca/0.log"
Apr 22 20:26:02.908690 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:02.908655 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:02.925222 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:02.925177 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn" podStartSLOduration=1.925163152 podStartE2EDuration="1.925163152s" podCreationTimestamp="2026-04-22 20:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:26:02.923059627 +0000 UTC m=+3751.612082764" watchObservedRunningTime="2026-04-22 20:26:02.925163152 +0000 UTC m=+3751.614186321"
Apr 22 20:26:03.906101 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:03.906068 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tcrbq_79e5fa54-5efe-4b12-a3d1-c4a889502cea/serve-healthcheck-canary/0.log"
Apr 22 20:26:04.258298 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:04.258211 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-dnrn8_38aa17f7-ede3-4506-a743-892303b3d9b7/insights-operator/0.log"
Apr 22 20:26:04.258997 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:04.258977 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-dnrn8_38aa17f7-ede3-4506-a743-892303b3d9b7/insights-operator/1.log"
Apr 22 20:26:04.404007 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:04.403978 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sbxjf_eaab3bc0-fc81-44b7-83fc-a0b27939da1a/kube-rbac-proxy/0.log"
Apr 22 20:26:04.423651 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:04.423622 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sbxjf_eaab3bc0-fc81-44b7-83fc-a0b27939da1a/exporter/0.log"
Apr 22 20:26:04.444039 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:04.444005 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sbxjf_eaab3bc0-fc81-44b7-83fc-a0b27939da1a/extractor/0.log"
Apr 22 20:26:06.447115 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:06.447083 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-545d8995fb-d6jd2_c00196bd-d0de-4482-b0ad-6948288211b3/manager/0.log"
Apr 22 20:26:06.466702 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:06.466678 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-kcz2n_cf162d4e-ec56-421a-ac73-a69be845984a/manager/0.log"
Apr 22 20:26:06.955234 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:06.955206 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-kbxgc_4ab439ba-7881-4aab-acc5-f8cb88d19978/seaweedfs/0.log"
Apr 22 20:26:06.977180 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:06.977153 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-b85g5_084ba333-869f-4315-a656-2e9b1fbf99c7/seaweedfs-tls-custom/0.log"
Apr 22 20:26:08.921703 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:08.921674 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-9c5pk/perf-node-gather-daemonset-fqkqn"
Apr 22 20:26:12.293905 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:12.293874 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grxvt_eb28e67f-4312-4175-a5fa-26a033fdf402/kube-multus-additional-cni-plugins/0.log"
Apr 22 20:26:12.314222 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:12.314195 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grxvt_eb28e67f-4312-4175-a5fa-26a033fdf402/egress-router-binary-copy/0.log"
Apr 22 20:26:12.337351 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:12.337286 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grxvt_eb28e67f-4312-4175-a5fa-26a033fdf402/cni-plugins/0.log"
Apr 22 20:26:12.355136 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:12.355118 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grxvt_eb28e67f-4312-4175-a5fa-26a033fdf402/bond-cni-plugin/0.log"
Apr 22 20:26:12.375413 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:12.375387 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grxvt_eb28e67f-4312-4175-a5fa-26a033fdf402/routeoverride-cni/0.log"
Apr 22 20:26:12.397786 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:12.397767 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grxvt_eb28e67f-4312-4175-a5fa-26a033fdf402/whereabouts-cni-bincopy/0.log"
Apr 22 20:26:12.419130 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:12.419110 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-grxvt_eb28e67f-4312-4175-a5fa-26a033fdf402/whereabouts-cni/0.log"
Apr 22 20:26:12.606903 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:12.606829 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mz5vn_dbd63835-2911-4f84-8572-eceb35993627/kube-multus/0.log"
Apr 22 20:26:12.711545 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:12.711516 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rwxr2_3f7c0766-21b4-4016-9a86-f022651a4b2e/network-metrics-daemon/0.log"
Apr 22 20:26:12.761461 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:12.761431 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rwxr2_3f7c0766-21b4-4016-9a86-f022651a4b2e/kube-rbac-proxy/0.log"
Apr 22 20:26:14.389696 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:14.389669 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-controller/0.log"
Apr 22 20:26:14.410149 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:14.410120 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/0.log"
Apr 22 20:26:14.426647 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:14.426617 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovn-acl-logging/1.log"
Apr 22 20:26:14.448508 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:14.448480 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/kube-rbac-proxy-node/0.log"
Apr 22 20:26:14.469305 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:14.469271 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 20:26:14.488381 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:14.488353 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/northd/0.log"
Apr 22 20:26:14.510421 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:14.510400 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/nbdb/0.log"
Apr 22 20:26:14.534013 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:14.533983 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/sbdb/0.log"
Apr 22 20:26:14.630271 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:14.630240 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rffw9_001543e3-9932-4f51-a285-c188ebe53071/ovnkube-controller/0.log"
Apr 22 20:26:15.616754 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:15.616713 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-lvq7b_f5bffc3c-5e52-4276-99b2-b8ec48bdd66a/check-endpoints/0.log"
Apr 22 20:26:15.640326 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:15.640301 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-fhzh7_189e1287-287e-4d9e-aabb-f15459c7ac43/network-check-target-container/0.log"
Apr 22 20:26:16.533141 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:16.533111 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5zp6w_ab5f4389-6ac2-4eab-b05a-9657f9124db1/iptables-alerter/0.log"
Apr 22 20:26:17.235777 ip-10-0-135-144 kubenswrapper[2574]: I0422 20:26:17.235750 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-dtzjz_ed659788-ce5c-4f08-b7a2-84ca2fdda6df/tuned/0.log"