Apr 16 20:11:55.435887 ip-10-0-142-60 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:11:55.903829 ip-10-0-142-60 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:55.903829 ip-10-0-142-60 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:11:55.903829 ip-10-0-142-60 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:55.903829 ip-10-0-142-60 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:11:55.903829 ip-10-0-142-60 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
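The deprecation notices above all point at the KubeletConfiguration file named by --config. A minimal sketch of moving those flags into that file, assuming the current upstream field names (containerRuntimeEndpoint, volumePluginDir, systemReserved, evictionHard); the socket path mirrors the --container-runtime-endpoint value logged by this kubelet (with the unix:// scheme prefixed), while the resource values and plugin directory are placeholders, not values taken from this node:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (placeholder path)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (placeholder values)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# replaces --minimum-container-ttl-duration per the notice above
evictionHard:
  memory.available: "100Mi"
```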
Apr 16 20:11:55.906202 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.906117 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:11:55.910176 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910162 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:55.910176 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910176 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910180 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910184 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910187 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910189 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910192 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910195 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910198 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910201 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910204 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910206 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910209 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910218 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910221 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910224 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910226 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910229 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910232 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910234 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910237 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:55.910240 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910240 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910242 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910245 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910248 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910253 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910256 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910259 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910262 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910265 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910269 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910274 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910277 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910279 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910282 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910284 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910287 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910290 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910292 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910295 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:55.910729 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910298 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910300 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910303 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910305 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910308 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910310 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910313 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910316 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910318 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910321 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910323 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910326 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910328 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910331 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910333 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910336 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910339 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910342 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910344 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910347 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910350 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:55.911208 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910352 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910355 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910357 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910361 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910363 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910366 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910368 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910371 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910373 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910375 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910378 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910380 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910383 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910385 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910388 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910390 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910392 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910395 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910398 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910400 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:55.911744 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910402 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910405 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910408 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910410 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910413 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910794 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910800 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910803 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910805 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910808 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910811 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910814 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910817 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910819 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910822 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910824 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910827 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910830 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910833 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910835 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:55.912219 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910838 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910840 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910843 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910845 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910848 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910850 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910853 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910855 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910858 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910861 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910864 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910867 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910870 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910872 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910874 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910877 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910879 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910882 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910885 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910888 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:55.912718 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910890 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910893 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910896 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910898 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910901 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910903 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910906 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910908 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910911 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910913 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910915 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910918 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910920 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910923 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910925 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910927 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910930 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910932 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910935 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910937 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:55.913216 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910940 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910942 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910944 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910947 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910950 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910952 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910955 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910957 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910960 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910963 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910968 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910971 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910974 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910976 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910979 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910982 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910985 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910987 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910989 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:55.913714 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910992 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910996 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.910999 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.911003 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.911005 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.911008 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.911011 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.911013 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.911016 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.911018 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.911021 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.911023 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911760 2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911773 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911780 2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911784 2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911789 2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911792 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911797 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911801 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:11:55.914203 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911805 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911808 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911811 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911817 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911820 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911823 2566 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911826 2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911829 2566 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911832 2566 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911835 2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911838 2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911844 2566 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911847 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911850 2566 flags.go:64] FLAG: --config-dir=""
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911853 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911857 2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911861 2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911864 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911867 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911870 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911873 2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911876 2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911879 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911882 2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911885 2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:11:55.914711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911890 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911893 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911896 2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911899 2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911902 2566 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911910 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911917 2566 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911921 2566 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911924 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911928 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911931 2566 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911935 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911938 2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911941 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911944 2566 flags.go:64] FLAG: --eviction-soft=""
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911947 2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911950 2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911953 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911956 2566 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911959 2566 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911962 2566 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911964 2566 flags.go:64] FLAG: --feature-gates=""
Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911968 2566 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911971 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911974 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 20:11:55.915308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911977 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911980 2566 flags.go:64] FLAG: --healthz-port="10248" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911983 2566 flags.go:64] FLAG: --help="false" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911986 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-142-60.ec2.internal" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911989 2566 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911992 2566 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911996 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.911999 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912003 2566 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912006 2566 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912009 2566 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 
20:11:55.912011 2566 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912014 2566 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912023 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912026 2566 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912029 2566 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912032 2566 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912035 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912038 2566 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912040 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912043 2566 flags.go:64] FLAG: --lock-file="" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912046 2566 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912050 2566 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912053 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:11:55.915970 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912058 2566 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912061 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:11:55.916547 
ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912064 2566 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912067 2566 flags.go:64] FLAG: --logging-format="text" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912069 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912073 2566 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912076 2566 flags.go:64] FLAG: --manifest-url="" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912079 2566 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912083 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912086 2566 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912090 2566 flags.go:64] FLAG: --max-pods="110" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912093 2566 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912096 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912099 2566 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912102 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912105 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912108 2566 
flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912111 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912119 2566 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912122 2566 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912124 2566 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912127 2566 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912146 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:11:55.916547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912151 2566 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912154 2566 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912157 2566 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912160 2566 flags.go:64] FLAG: --port="10250" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912164 2566 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912166 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09df5e60cf2c79cc2" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912169 2566 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912172 
2566 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912175 2566 flags.go:64] FLAG: --register-node="true" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912178 2566 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912181 2566 flags.go:64] FLAG: --register-with-taints="" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912189 2566 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912192 2566 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912195 2566 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912197 2566 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912201 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912204 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912207 2566 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912214 2566 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912217 2566 flags.go:64] FLAG: --runonce="false" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912220 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912223 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912226 2566 
flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912229 2566 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912234 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912237 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:11:55.917172 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912240 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912244 2566 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912246 2566 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912249 2566 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912252 2566 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912255 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912264 2566 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912267 2566 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912270 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912276 2566 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912278 2566 flags.go:64] FLAG: --tls-cert-file="" Apr 16 
20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912281 2566 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912287 2566 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912290 2566 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912293 2566 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912296 2566 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912299 2566 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912301 2566 flags.go:64] FLAG: --v="2" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912306 2566 flags.go:64] FLAG: --version="false" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912310 2566 flags.go:64] FLAG: --vmodule="" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912314 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.912317 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912423 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
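The deprecation warnings at startup say these flags should move into the config file the kubelet already loads (`--config="/etc/kubernetes/kubelet.conf"`). As a hedged sketch (not this cluster's actual file), the deprecated flags visible in the FLAG dump above could be expressed as a `KubeletConfiguration` fragment, using the upstream `kubelet.config.k8s.io/v1beta1` field names and the values taken directly from the log; depending on kubelet version, the runtime endpoint may need a `unix://` scheme prefix:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint="/var/run/crio/crio.sock"
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
systemReserved:
  cpu: "500m"
  ephemeral-storage: "1Gi"
  memory: "1Gi"
```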
Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912428 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:55.917814 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912433 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912436 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912439 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912442 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912444 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912447 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912451 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912453 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912456 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912458 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912461 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912464 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912466 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912469 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912472 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912477 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912480 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912483 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912486 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912488 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:55.918386 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912490 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912493 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912496 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912498 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912501 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912503 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912506 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912510 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912513 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912516 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912519 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912521 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912525 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912528 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912530 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912532 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912535 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912538 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912541 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:55.918952 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912544 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912546 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912549 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912552 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912555 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912557 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912559 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912562 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912564 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912573 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912575 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912578 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912580 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912583 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912585 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912588 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912590 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912593 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912595 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912611 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:55.919535 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912614 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912617 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912619 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912621 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912624 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912628 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912631 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912634 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912636 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912639 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912642 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912649 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912651 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912654 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912657 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912659 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912662 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912665 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912667 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912670 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:55.920220 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912672 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:55.921001 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912675 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:55.921001 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912678 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:55.921001 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912681 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:55.921001 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.912683 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:55.921001 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.913473 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:55.924648 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.924625 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 20:11:55.924648 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.924647 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924721 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924729 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924734 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924739 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924744 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924748 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924752 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924759 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924766 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924771 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924777 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924781 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924785 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924789 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924793 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:55.924789 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924798 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924802 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924806 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924810 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924814 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924818 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924822 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924826 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924830 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924836 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924840 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924845 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924849 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924853 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924857 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924861 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924865 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924870 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924874 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924878 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:55.925501 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924882 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924886 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924890 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924894 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924898 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924902 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924906 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924910 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924915 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924919 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924923 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924928 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924932 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924936 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924941 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924945 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924951 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924955 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924959 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924963 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:55.926152 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924967 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924972 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924976 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924980 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924985 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924990 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924994 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.924998 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925002 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925007 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925013 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925018 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925022 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925026 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925031 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925035 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925039 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925043 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925048 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925052 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:55.926742 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925057 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925070 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925074 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925079 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925083 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925087 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925091 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925096 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925100 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925104 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925108 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.925116 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925276 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925285 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925289 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925294 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:55.927414 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925298 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925303 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925307 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925311 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925316 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925320 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925325 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925329 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925333 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925337 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925341 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925346 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925350 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925355 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925359 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925363 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925367 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925371 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925377 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925381 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:55.927975 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925386 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925390 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925394 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925399 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925403 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925407 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925411 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925415 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925419 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925423 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925427 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925432 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925436 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925440 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925444 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925448 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925452 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925457 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925461 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925466 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:55.928611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925470 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925474 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925478 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925483 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925487 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925491 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925495 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925499 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925503 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925507 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925511 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925516 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925519 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925524 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925528 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925533 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925537 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925541 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925546 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925550 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:55.929434 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925554 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925561 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925566 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925571 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925576 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925581 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925585 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925590 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925594 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925614 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925619 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925624 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925628 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925631 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925636 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925640 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925644 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925648 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925655 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:55.930016 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925660 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:55.930551 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925664 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:55.930551 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:55.925668 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:55.930551 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.925676 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:55.930551 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.926464 2566 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 20:11:55.930551 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.929980 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 20:11:55.931064 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.931051 2566 server.go:1019] "Starting client certificate rotation"
Apr 16 20:11:55.931168 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.931151 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:11:55.931205 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.931194 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:11:55.959147 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.959119 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:11:55.967313 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:55.967292 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:11:56.004356 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.004337 2566 log.go:25] "Validated CRI v1 runtime API"
Apr 16 20:11:56.012768 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.012750 2566 log.go:25] "Validated CRI v1 image API"
Apr 16 20:11:56.014059 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.014043 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 20:11:56.017297 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.017281 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:11:56.018208 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.018185 2566 fs.go:135] Filesystem UUIDs: map[08bb4658-dd1c-4029-9b3b-df1673bc5fa2:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 7eda031c-7f9e-4d7e-acdb-7af0a575d0b7:/dev/nvme0n1p4]
Apr 16 20:11:56.018262 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.018208 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 20:11:56.025562 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.025451 2566 manager.go:217] Machine: {Timestamp:2026-04-16 20:11:56.023086598 +0000 UTC m=+0.462731560 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099929 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2da16e792a07a7f1f42e42d0c5ee61 SystemUUID:ec2da16e-792a-07a7-f1f4-2e42d0c5ee61 BootID:8b464e62-b917-418c-bced-5a44b3b08b62 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a1:07:c1:a1:b3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a1:07:c1:a1:b3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:f2:af:b2:e6:06 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 20:11:56.025562 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.025549 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 20:11:56.025734 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.025651 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 20:11:56.026683 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.026657 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 20:11:56.026849 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.026686 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-60.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 20:11:56.026926 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.026863 2566 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 20:11:56.026926 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.026875 2566 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 20:11:56.026926 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.026894 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:11:56.027821 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.027809 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 20:11:56.029150 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.029137 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:11:56.029279 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.029268 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 20:11:56.031764 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.031753 2566 kubelet.go:491] "Attempting to sync node with API server" Apr 16 20:11:56.031823 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.031772 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 20:11:56.031823 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.031788 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 20:11:56.031823 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.031801 2566 kubelet.go:397] "Adding apiserver pod source" Apr 16 20:11:56.031823 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.031814 2566 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 20:11:56.032802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.032786 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:11:56.032868 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.032812 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 20:11:56.035832 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.035815 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 20:11:56.037514 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.037501 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 20:11:56.038883 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038867 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 20:11:56.038935 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038892 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 20:11:56.038935 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038901 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 20:11:56.038935 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038907 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 20:11:56.038935 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038914 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 20:11:56.038935 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038919 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 20:11:56.038935 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038925 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
20:11:56.038935 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038931 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 20:11:56.039127 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038939 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 20:11:56.039127 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038946 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 20:11:56.039127 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038955 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 20:11:56.039127 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038964 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 20:11:56.039127 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.038902 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kqkn9" Apr 16 20:11:56.041004 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.040990 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 20:11:56.041048 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.041006 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 20:11:56.044287 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.044262 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-60.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 20:11:56.044377 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.044350 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" 
at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 20:11:56.044483 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.044468 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-60.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 20:11:56.044673 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.044663 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 20:11:56.044715 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.044698 2566 server.go:1295] "Started kubelet" Apr 16 20:11:56.044806 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.044781 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 20:11:56.044840 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.044784 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 20:11:56.044873 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.044849 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 20:11:56.045492 ip-10-0-142-60 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 20:11:56.046715 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.046695 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 20:11:56.047400 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.047389 2566 server.go:317] "Adding debug handlers to kubelet server" Apr 16 20:11:56.049628 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.049596 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kqkn9" Apr 16 20:11:56.051032 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.051011 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 20:11:56.051658 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.051641 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 20:11:56.051901 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.050872 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-60.ec2.internal.18a6ef640d642238 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-60.ec2.internal,UID:ip-10-0-142-60.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-60.ec2.internal,},FirstTimestamp:2026-04-16 20:11:56.044673592 +0000 UTC m=+0.484318548,LastTimestamp:2026-04-16 20:11:56.044673592 +0000 UTC m=+0.484318548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-60.ec2.internal,}" Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.052260 2566 desired_state_of_world_populator.go:150] 
"Desired state populator starts to run" Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.052265 2566 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.052287 2566 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.052380 2566 reconstruct.go:97] "Volume reconstruction finished" Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.052389 2566 reconciler.go:26] "Reconciler: start to sync state" Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.052414 2566 factory.go:55] Registering systemd factory Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.052431 2566 factory.go:223] Registration of the systemd container factory successfully Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.052465 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-60.ec2.internal\" not found" Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.052679 2566 factory.go:153] Registering CRI-O factory Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.052693 2566 factory.go:223] Registration of the crio container factory successfully Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.052740 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.052786 2566 factory.go:103] Registering Raw factory Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.052800 2566 manager.go:1196] Started watching for new ooms in manager 
Apr 16 20:11:56.061289 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.053401 2566 manager.go:319] Starting recovery of all containers Apr 16 20:11:56.062632 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.062461 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:56.065787 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.065765 2566 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-60.ec2.internal\" not found" node="ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.065893 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.065882 2566 manager.go:324] Recovery completed Apr 16 20:11:56.070068 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.070055 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:56.072529 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.072427 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:56.072529 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.072454 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:56.072529 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.072465 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:56.072996 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.072981 2566 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 20:11:56.072996 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.072995 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 20:11:56.073087 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.073019 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 16 20:11:56.075793 
ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.075781 2566 policy_none.go:49] "None policy: Start" Apr 16 20:11:56.075832 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.075797 2566 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 20:11:56.075832 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.075806 2566 state_mem.go:35] "Initializing new in-memory state store" Apr 16 20:11:56.117161 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.117142 2566 manager.go:341] "Starting Device Plugin manager" Apr 16 20:11:56.126760 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.117190 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 20:11:56.126760 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.117205 2566 server.go:85] "Starting device plugin registration server" Apr 16 20:11:56.126760 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.117408 2566 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 20:11:56.126760 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.117417 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 20:11:56.126760 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.117531 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 20:11:56.126760 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.117629 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 20:11:56.126760 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.117640 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 20:11:56.126760 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.118076 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 20:11:56.126760 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.118117 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-60.ec2.internal\" not found" Apr 16 20:11:56.217784 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.217719 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:56.218721 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.218706 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:56.218809 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.218734 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:56.218809 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.218751 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:56.218809 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.218783 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.231658 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.231642 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.231703 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.231662 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-60.ec2.internal\": node \"ip-10-0-142-60.ec2.internal\" not found" Apr 16 20:11:56.242417 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.242391 2566 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 16 20:11:56.243488 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.243473 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 20:11:56.243543 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.243499 2566 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 20:11:56.243543 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.243517 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 20:11:56.243543 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.243523 2566 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 20:11:56.243678 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.243554 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 20:11:56.246259 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.246240 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:56.249126 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.249110 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-60.ec2.internal\" not found" Apr 16 20:11:56.344013 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.343978 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-142-60.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal"] Apr 16 20:11:56.344123 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.344057 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:56.344875 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.344860 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:56.344956 
ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.344886 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:56.344956 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.344896 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:56.346115 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.346103 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:56.346258 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.346244 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.346298 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.346273 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:56.346754 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.346739 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:56.346816 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.346761 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:56.346816 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.346773 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:56.346816 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.346799 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:56.346911 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.346820 2566 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-142-60.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:56.346911 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.346830 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:56.348012 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.347996 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.348073 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.348029 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:56.348666 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.348639 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:56.348666 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.348666 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:56.348790 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.348689 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:56.349807 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.349795 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-60.ec2.internal\" not found" Apr 16 20:11:56.353376 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.353361 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab040f816c3a4b4dcd7ff7115ed9725d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal\" (UID: \"ab040f816c3a4b4dcd7ff7115ed9725d\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.353442 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.353384 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4950d98dd76e149e921ac7e36cab051b-config\") pod \"kube-apiserver-proxy-ip-10-0-142-60.ec2.internal\" (UID: \"4950d98dd76e149e921ac7e36cab051b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.353442 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.353399 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ab040f816c3a4b4dcd7ff7115ed9725d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal\" (UID: \"ab040f816c3a4b4dcd7ff7115ed9725d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.385194 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.385174 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-60.ec2.internal\" not found" node="ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.389383 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.389369 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-60.ec2.internal\" not found" node="ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.450218 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.450197 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-60.ec2.internal\" not found" Apr 16 20:11:56.453964 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.453946 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/4950d98dd76e149e921ac7e36cab051b-config\") pod \"kube-apiserver-proxy-ip-10-0-142-60.ec2.internal\" (UID: \"4950d98dd76e149e921ac7e36cab051b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.454044 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.453971 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ab040f816c3a4b4dcd7ff7115ed9725d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal\" (UID: \"ab040f816c3a4b4dcd7ff7115ed9725d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.454044 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.453987 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab040f816c3a4b4dcd7ff7115ed9725d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal\" (UID: \"ab040f816c3a4b4dcd7ff7115ed9725d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.454044 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.454011 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab040f816c3a4b4dcd7ff7115ed9725d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal\" (UID: \"ab040f816c3a4b4dcd7ff7115ed9725d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal" Apr 16 20:11:56.454044 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.454040 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4950d98dd76e149e921ac7e36cab051b-config\") pod \"kube-apiserver-proxy-ip-10-0-142-60.ec2.internal\" (UID: \"4950d98dd76e149e921ac7e36cab051b\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-142-60.ec2.internal"
Apr 16 20:11:56.454167 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.454052 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ab040f816c3a4b4dcd7ff7115ed9725d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal\" (UID: \"ab040f816c3a4b4dcd7ff7115ed9725d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal"
Apr 16 20:11:56.550906 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.550823 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-60.ec2.internal\" not found"
Apr 16 20:11:56.651252 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.651231 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-60.ec2.internal\" not found"
Apr 16 20:11:56.687673 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.687647 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-60.ec2.internal"
Apr 16 20:11:56.691286 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.691271 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal"
Apr 16 20:11:56.751965 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.751939 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-60.ec2.internal\" not found"
Apr 16 20:11:56.852423 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:56.852375 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-60.ec2.internal\" not found"
Apr 16 20:11:56.900910 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.900886 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:56.931080 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.931059 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 20:11:56.931638 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.931192 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:56.931638 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.931215 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:56.931638 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.931218 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:56.952218 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.952187 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal"
Apr 16 20:11:56.970430 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.970410 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:11:56.972217 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.972203 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-60.ec2.internal"
Apr 16 20:11:56.980566 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:56.980551 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:11:57.032670 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.032645 2566 apiserver.go:52] "Watching apiserver"
Apr 16 20:11:57.043851 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.043826 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 20:11:57.044249 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.044229 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-142-60.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp","openshift-cluster-node-tuning-operator/tuned-68ppm","openshift-dns/node-resolver-q76dr","openshift-image-registry/node-ca-5wvp4","openshift-multus/multus-additional-cni-plugins-4llv8","openshift-multus/multus-s4bbh","openshift-multus/network-metrics-daemon-jdfnl","kube-system/konnectivity-agent-kzrvj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal","openshift-network-diagnostics/network-check-target-tkpmf","openshift-network-operator/iptables-alerter-qmblv","openshift-ovn-kubernetes/ovnkube-node-8lrq8"]
Apr 16 20:11:57.046240 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.046224 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.047423 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.047403 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.048299 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.048281 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 20:11:57.048299 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.048293 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 20:11:57.048442 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.048320 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qhfrq\""
Apr 16 20:11:57.048442 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.048420 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 20:11:57.049016 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.048997 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q76dr"
Apr 16 20:11:57.049480 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.049462 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 20:11:57.049564 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.049497 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:11:57.049564 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.049503 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zmnkz\""
Apr 16 20:11:57.050686 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.050569 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5wvp4"
Apr 16 20:11:57.050884 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.050740 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 20:11:57.051008 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.050992 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 20:11:57.051068 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.051046 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-v9s6x\""
Apr 16 20:11:57.051134 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.051104 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 20:11:57.051742 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.051719 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:06:56 +0000 UTC" deadline="2027-12-14 08:09:56.07674352 +0000 UTC"
Apr 16 20:11:57.051742 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.051740 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14555h57m59.025006137s"
Apr 16 20:11:57.051885 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.051832 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.052250 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.052235 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mlz62\""
Apr 16 20:11:57.052718 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.052690 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 20:11:57.052801 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.052720 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 20:11:57.052857 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.052817 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 20:11:57.053465 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.053446 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.053700 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.053680 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 20:11:57.054003 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.053990 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zj42d\""
Apr 16 20:11:57.054314 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.054293 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 20:11:57.054442 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.054317 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 20:11:57.054538 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.054523 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 20:11:57.054619 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.054579 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 20:11:57.054856 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.054839 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:11:57.054924 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.054906 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b"
Apr 16 20:11:57.055496 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.055480 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 20:11:57.055713 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.055497 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-c8fnz\""
Apr 16 20:11:57.055993 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.055979 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6321d139-42ba-4ad4-96d3-6dafabbdc869-os-release\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.056033 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056004 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-socket-dir\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.056033 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056020 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-device-dir\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.056109 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056034 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-modprobe-d\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.056109 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056050 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-sysctl-conf\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.056109 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056064 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-run\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.056109 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056096 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-lib-modules\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.056299 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056125 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3b621d74-7f5b-47e2-afbb-a2c1610adc49-tmp\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.056299 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056175 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-tuned\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.056299 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056213 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87948db0-f0f9-46ff-ad52-0b6cb7a17f42-tmp-dir\") pod \"node-resolver-q76dr\" (UID: \"87948db0-f0f9-46ff-ad52-0b6cb7a17f42\") " pod="openshift-dns/node-resolver-q76dr"
Apr 16 20:11:57.056299 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056241 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3fe6aa55-9c5e-4ed7-bd31-4790e51c271b-serviceca\") pod \"node-ca-5wvp4\" (UID: \"3fe6aa55-9c5e-4ed7-bd31-4790e51c271b\") " pod="openshift-image-registry/node-ca-5wvp4"
Apr 16 20:11:57.056299 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056264 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6321d139-42ba-4ad4-96d3-6dafabbdc869-cnibin\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.056299 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056290 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6321d139-42ba-4ad4-96d3-6dafabbdc869-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.056531 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056316 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-multus-cni-dir\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.056531 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056339 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-cnibin\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.056531 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056360 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-os-release\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.056531 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056363 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kzrvj"
Apr 16 20:11:57.056531 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056396 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-etc-selinux\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.056531 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056434 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-sys\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.056531 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056461 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6321d139-42ba-4ad4-96d3-6dafabbdc869-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.056531 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056477 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-run-k8s-cni-cncf-io\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.056531 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056492 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-run-netns\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.056531 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056510 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aae5927d-11b8-46a5-a3e9-c3be8d357974-multus-daemon-config\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.056531 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056532 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-run-multus-certs\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056617 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-sys-fs\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056646 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/87948db0-f0f9-46ff-ad52-0b6cb7a17f42-hosts-file\") pod \"node-resolver-q76dr\" (UID: \"87948db0-f0f9-46ff-ad52-0b6cb7a17f42\") " pod="openshift-dns/node-resolver-q76dr"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056661 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-multus-socket-dir-parent\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056679 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-var-lib-cni-multus\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056707 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-var-lib-kubelet\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056724 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-etc-kubernetes\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056746 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87sc9\" (UniqueName: \"kubernetes.io/projected/87948db0-f0f9-46ff-ad52-0b6cb7a17f42-kube-api-access-87sc9\") pod \"node-resolver-q76dr\" (UID: \"87948db0-f0f9-46ff-ad52-0b6cb7a17f42\") " pod="openshift-dns/node-resolver-q76dr"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056762 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-sysconfig\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056775 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-registration-dir\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056789 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxfv4\" (UniqueName: \"kubernetes.io/projected/6789abf2-1a59-4d55-9f2d-c976b4762dab-kube-api-access-hxfv4\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056805 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6321d139-42ba-4ad4-96d3-6dafabbdc869-system-cni-dir\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056839 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-system-cni-dir\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056879 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-kubernetes\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056901 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-systemd\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056932 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6321d139-42ba-4ad4-96d3-6dafabbdc869-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.057024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056959 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n8w5\" (UniqueName: \"kubernetes.io/projected/6321d139-42ba-4ad4-96d3-6dafabbdc869-kube-api-access-5n8w5\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.056985 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aae5927d-11b8-46a5-a3e9-c3be8d357974-cni-binary-copy\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057011 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-var-lib-cni-bin\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057032 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-hostroot\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057068 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057103 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-var-lib-kubelet\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057126 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-host\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057142 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldkjr\" (UniqueName: \"kubernetes.io/projected/3fe6aa55-9c5e-4ed7-bd31-4790e51c271b-kube-api-access-ldkjr\") pod \"node-ca-5wvp4\" (UID: \"3fe6aa55-9c5e-4ed7-bd31-4790e51c271b\") " pod="openshift-image-registry/node-ca-5wvp4"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057165 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6321d139-42ba-4ad4-96d3-6dafabbdc869-cni-binary-copy\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057192 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-multus-conf-dir\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057214 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smxx6\" (UniqueName: \"kubernetes.io/projected/aae5927d-11b8-46a5-a3e9-c3be8d357974-kube-api-access-smxx6\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057241 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-sysctl-d\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057256 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl9bc\" (UniqueName: \"kubernetes.io/projected/3b621d74-7f5b-47e2-afbb-a2c1610adc49-kube-api-access-nl9bc\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.057528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057283 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fe6aa55-9c5e-4ed7-bd31-4790e51c271b-host\") pod \"node-ca-5wvp4\" (UID: \"3fe6aa55-9c5e-4ed7-bd31-4790e51c271b\") " pod="openshift-image-registry/node-ca-5wvp4"
Apr 16 20:11:57.057912 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.057547 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:11:57.057912 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.057594 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38"
Apr 16 20:11:57.058270 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.058252 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 20:11:57.058364 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.058343 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-99qj6\""
Apr 16 20:11:57.058413 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.058353 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 20:11:57.058887 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.058874 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qmblv"
Apr 16 20:11:57.060391 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.060368 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.060724 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.060571 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:11:57.060873 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.060841 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 20:11:57.060958 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.060857 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 20:11:57.060958 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.060948 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-4j5tt\""
Apr 16 20:11:57.062239 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.062222 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 20:11:57.062342 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.062225 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 20:11:57.063504 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.063190 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:11:57.063504 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.063264 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 20:11:57.063504 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.063284 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 20:11:57.063734 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.063715 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rn2s6\""
Apr 16 20:11:57.063954 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.063817 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 20:11:57.063954 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.063837 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 20:11:57.086517 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.086492 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wvpsp"
Apr 16 20:11:57.097458 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.097438 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wvpsp"
Apr 16 20:11:57.152570 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:57.152528 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab040f816c3a4b4dcd7ff7115ed9725d.slice/crio-6d4f7dc427f392099eb59b4c406dff3cc35d94d9f82b022822a42055d608e2ae WatchSource:0}: Error finding container 6d4f7dc427f392099eb59b4c406dff3cc35d94d9f82b022822a42055d608e2ae: Status 404 returned error can't find the container with id 6d4f7dc427f392099eb59b4c406dff3cc35d94d9f82b022822a42055d608e2ae
Apr 16 20:11:57.152993 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.152978 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 20:11:57.157392 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157372 2566 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-host\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.157467 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157400 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldkjr\" (UniqueName: \"kubernetes.io/projected/3fe6aa55-9c5e-4ed7-bd31-4790e51c271b-kube-api-access-ldkjr\") pod \"node-ca-5wvp4\" (UID: \"3fe6aa55-9c5e-4ed7-bd31-4790e51c271b\") " pod="openshift-image-registry/node-ca-5wvp4" Apr 16 20:11:57.157467 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157416 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smxx6\" (UniqueName: \"kubernetes.io/projected/aae5927d-11b8-46a5-a3e9-c3be8d357974-kube-api-access-smxx6\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.157467 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157426 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:11:57.157467 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157438 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-run-ovn\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.157467 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157464 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-sysctl-d\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") 
" pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.157719 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157483 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-host\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.157719 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157509 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl9bc\" (UniqueName: \"kubernetes.io/projected/3b621d74-7f5b-47e2-afbb-a2c1610adc49-kube-api-access-nl9bc\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.157719 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157548 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-run-systemd\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.157719 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157576 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.157719 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157624 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-device-dir\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp" Apr 16 20:11:57.157719 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157647 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-sysctl-d\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.157719 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157649 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-modprobe-d\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.157719 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157701 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-sysctl-conf\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.157719 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157709 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-device-dir\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp" Apr 16 20:11:57.158146 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157729 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"run\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-run\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.158146 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157751 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-lib-modules\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.158146 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157777 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87948db0-f0f9-46ff-ad52-0b6cb7a17f42-tmp-dir\") pod \"node-resolver-q76dr\" (UID: \"87948db0-f0f9-46ff-ad52-0b6cb7a17f42\") " pod="openshift-dns/node-resolver-q76dr" Apr 16 20:11:57.158146 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157786 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-modprobe-d\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.158146 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157804 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-node-log\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.158146 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157829 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-run\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.158146 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157820 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-sysctl-conf\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.158146 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157867 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-log-socket\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.158146 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157882 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-lib-modules\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.158146 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158070 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87948db0-f0f9-46ff-ad52-0b6cb7a17f42-tmp-dir\") pod \"node-resolver-q76dr\" (UID: \"87948db0-f0f9-46ff-ad52-0b6cb7a17f42\") " pod="openshift-dns/node-resolver-q76dr" Apr 16 20:11:57.158666 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158334 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:11:57.158666 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.157891 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-tuned\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.158666 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158461 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3fe6aa55-9c5e-4ed7-bd31-4790e51c271b-serviceca\") pod \"node-ca-5wvp4\" (UID: \"3fe6aa55-9c5e-4ed7-bd31-4790e51c271b\") " pod="openshift-image-registry/node-ca-5wvp4" Apr 16 20:11:57.158666 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158488 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6321d139-42ba-4ad4-96d3-6dafabbdc869-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8" Apr 16 20:11:57.158666 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158514 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-os-release\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.158666 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158541 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-run-openvswitch\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.158666 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158565 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d24ecb8-b036-4a27-8ff3-1283740a16d5-ovnkube-config\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.158666 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158591 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-etc-selinux\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158699 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-os-release\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158780 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-run-netns\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158811 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/aae5927d-11b8-46a5-a3e9-c3be8d357974-multus-daemon-config\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158783 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-etc-selinux\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158841 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f75sm\" (UniqueName: \"kubernetes.io/projected/02d874be-6206-4feb-99d1-3539318d290b-kube-api-access-f75sm\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158882 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz\") pod \"network-check-target-tkpmf\" (UID: \"3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38\") " pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158901 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-run-netns\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158908 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d24ecb8-b036-4a27-8ff3-1283740a16d5-env-overrides\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158979 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-var-lib-kubelet\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.158994 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3fe6aa55-9c5e-4ed7-bd31-4790e51c271b-serviceca\") pod \"node-ca-5wvp4\" (UID: \"3fe6aa55-9c5e-4ed7-bd31-4790e51c271b\") " pod="openshift-image-registry/node-ca-5wvp4" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159013 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-etc-kubernetes\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159017 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-var-lib-kubelet\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159045 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/88690dc9-b75a-4009-b53f-717dd6e43bda-iptables-alerter-script\") pod \"iptables-alerter-qmblv\" (UID: \"88690dc9-b75a-4009-b53f-717dd6e43bda\") " pod="openshift-network-operator/iptables-alerter-qmblv" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159076 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-etc-kubernetes\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159072 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88690dc9-b75a-4009-b53f-717dd6e43bda-host-slash\") pod \"iptables-alerter-qmblv\" (UID: \"88690dc9-b75a-4009-b53f-717dd6e43bda\") " pod="openshift-network-operator/iptables-alerter-qmblv" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159105 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.159155 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159130 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-cni-netd\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 
20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159154 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-slash\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159179 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bngnk\" (UniqueName: \"kubernetes.io/projected/9d24ecb8-b036-4a27-8ff3-1283740a16d5-kube-api-access-bngnk\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159209 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6321d139-42ba-4ad4-96d3-6dafabbdc869-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159228 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-systemd\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159267 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5n8w5\" (UniqueName: \"kubernetes.io/projected/6321d139-42ba-4ad4-96d3-6dafabbdc869-kube-api-access-5n8w5\") pod 
\"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159295 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159321 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6321d139-42ba-4ad4-96d3-6dafabbdc869-cni-binary-copy\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159336 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-systemd\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159371 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aae5927d-11b8-46a5-a3e9-c3be8d357974-multus-daemon-config\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159398 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159945 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6321d139-42ba-4ad4-96d3-6dafabbdc869-cni-binary-copy\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159951 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-multus-conf-dir\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.159998 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-multus-conf-dir\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160003 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fe6aa55-9c5e-4ed7-bd31-4790e51c271b-host\") pod \"node-ca-5wvp4\" (UID: \"3fe6aa55-9c5e-4ed7-bd31-4790e51c271b\") " pod="openshift-image-registry/node-ca-5wvp4" Apr 16 20:11:57.160047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160040 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/6321d139-42ba-4ad4-96d3-6dafabbdc869-os-release\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160069 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a90e7ceb-6160-4490-81d3-0bf334a5861e-konnectivity-ca\") pod \"konnectivity-agent-kzrvj\" (UID: \"a90e7ceb-6160-4490-81d3-0bf334a5861e\") " pod="kube-system/konnectivity-agent-kzrvj"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160089 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6321d139-42ba-4ad4-96d3-6dafabbdc869-os-release\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160095 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-run-netns\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160041 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fe6aa55-9c5e-4ed7-bd31-4790e51c271b-host\") pod \"node-ca-5wvp4\" (UID: \"3fe6aa55-9c5e-4ed7-bd31-4790e51c271b\") " pod="openshift-image-registry/node-ca-5wvp4"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160123 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-socket-dir\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160143 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3b621d74-7f5b-47e2-afbb-a2c1610adc49-tmp\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160163 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-multus-cni-dir\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160187 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-cnibin\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160211 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-var-lib-cni-multus\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160245 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160267 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a90e7ceb-6160-4490-81d3-0bf334a5861e-agent-certs\") pod \"konnectivity-agent-kzrvj\" (UID: \"a90e7ceb-6160-4490-81d3-0bf334a5861e\") " pod="kube-system/konnectivity-agent-kzrvj"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160284 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-socket-dir\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160336 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-cnibin\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160346 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6321d139-42ba-4ad4-96d3-6dafabbdc869-cnibin\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160291 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6321d139-42ba-4ad4-96d3-6dafabbdc869-cnibin\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160386 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-var-lib-cni-multus\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160445 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-multus-cni-dir\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.160802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160447 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-run-multus-certs\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160484 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-systemd-units\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160524 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-run-multus-certs\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160556 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d24ecb8-b036-4a27-8ff3-1283740a16d5-ovn-node-metrics-cert\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160594 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-sys\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160640 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6321d139-42ba-4ad4-96d3-6dafabbdc869-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160667 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-run-k8s-cni-cncf-io\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160710 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-etc-openvswitch\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160741 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-run-k8s-cni-cncf-io\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160755 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-sys-fs\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160788 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/87948db0-f0f9-46ff-ad52-0b6cb7a17f42-hosts-file\") pod \"node-resolver-q76dr\" (UID: \"87948db0-f0f9-46ff-ad52-0b6cb7a17f42\") " pod="openshift-dns/node-resolver-q76dr"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160794 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-sys-fs\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160814 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6321d139-42ba-4ad4-96d3-6dafabbdc869-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160816 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-multus-socket-dir-parent\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160864 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-sys\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160881 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfzjs\" (UniqueName: \"kubernetes.io/projected/88690dc9-b75a-4009-b53f-717dd6e43bda-kube-api-access-gfzjs\") pod \"iptables-alerter-qmblv\" (UID: \"88690dc9-b75a-4009-b53f-717dd6e43bda\") " pod="openshift-network-operator/iptables-alerter-qmblv"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160908 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/87948db0-f0f9-46ff-ad52-0b6cb7a17f42-hosts-file\") pod \"node-resolver-q76dr\" (UID: \"87948db0-f0f9-46ff-ad52-0b6cb7a17f42\") " pod="openshift-dns/node-resolver-q76dr"
Apr 16 20:11:57.161728 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160927 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-multus-socket-dir-parent\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160930 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-var-lib-openvswitch\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160966 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-cni-bin\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.160995 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87sc9\" (UniqueName: \"kubernetes.io/projected/87948db0-f0f9-46ff-ad52-0b6cb7a17f42-kube-api-access-87sc9\") pod \"node-resolver-q76dr\" (UID: \"87948db0-f0f9-46ff-ad52-0b6cb7a17f42\") " pod="openshift-dns/node-resolver-q76dr"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161022 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-sysconfig\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161047 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-registration-dir\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161081 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxfv4\" (UniqueName: \"kubernetes.io/projected/6789abf2-1a59-4d55-9f2d-c976b4762dab-kube-api-access-hxfv4\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161107 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6321d139-42ba-4ad4-96d3-6dafabbdc869-system-cni-dir\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161121 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-sysconfig\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161133 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-system-cni-dir\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161159 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-kubernetes\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161187 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6321d139-42ba-4ad4-96d3-6dafabbdc869-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161210 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aae5927d-11b8-46a5-a3e9-c3be8d357974-cni-binary-copy\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161234 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-var-lib-cni-bin\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161255 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-hostroot\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161268 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6789abf2-1a59-4d55-9f2d-c976b4762dab-registration-dir\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161277 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-kubelet\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.162590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161300 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9d24ecb8-b036-4a27-8ff3-1283740a16d5-ovnkube-script-lib\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.163765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161318 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-host-var-lib-cni-bin\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.163765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161326 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-var-lib-kubelet\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.163765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161334 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-system-cni-dir\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.163765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161355 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6321d139-42ba-4ad4-96d3-6dafabbdc869-system-cni-dir\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.163765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161407 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-var-lib-kubelet\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.163765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161444 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-kubernetes\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.163765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161828 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aae5927d-11b8-46a5-a3e9-c3be8d357974-cni-binary-copy\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.163765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161869 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6321d139-42ba-4ad4-96d3-6dafabbdc869-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.163765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.161930 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aae5927d-11b8-46a5-a3e9-c3be8d357974-hostroot\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.163765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.162392 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3b621d74-7f5b-47e2-afbb-a2c1610adc49-etc-tuned\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.163765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.162531 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3b621d74-7f5b-47e2-afbb-a2c1610adc49-tmp\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.165782 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.165435 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl9bc\" (UniqueName: \"kubernetes.io/projected/3b621d74-7f5b-47e2-afbb-a2c1610adc49-kube-api-access-nl9bc\") pod \"tuned-68ppm\" (UID: \"3b621d74-7f5b-47e2-afbb-a2c1610adc49\") " pod="openshift-cluster-node-tuning-operator/tuned-68ppm"
Apr 16 20:11:57.169781 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.169759 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smxx6\" (UniqueName: \"kubernetes.io/projected/aae5927d-11b8-46a5-a3e9-c3be8d357974-kube-api-access-smxx6\") pod \"multus-s4bbh\" (UID: \"aae5927d-11b8-46a5-a3e9-c3be8d357974\") " pod="openshift-multus/multus-s4bbh"
Apr 16 20:11:57.169859 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.169818 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldkjr\" (UniqueName: \"kubernetes.io/projected/3fe6aa55-9c5e-4ed7-bd31-4790e51c271b-kube-api-access-ldkjr\") pod \"node-ca-5wvp4\" (UID: \"3fe6aa55-9c5e-4ed7-bd31-4790e51c271b\") " pod="openshift-image-registry/node-ca-5wvp4"
Apr 16 20:11:57.172647 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.172623 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n8w5\" (UniqueName: \"kubernetes.io/projected/6321d139-42ba-4ad4-96d3-6dafabbdc869-kube-api-access-5n8w5\") pod \"multus-additional-cni-plugins-4llv8\" (UID: \"6321d139-42ba-4ad4-96d3-6dafabbdc869\") " pod="openshift-multus/multus-additional-cni-plugins-4llv8"
Apr 16 20:11:57.172750 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.172705 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87sc9\" (UniqueName: \"kubernetes.io/projected/87948db0-f0f9-46ff-ad52-0b6cb7a17f42-kube-api-access-87sc9\") pod \"node-resolver-q76dr\" (UID: \"87948db0-f0f9-46ff-ad52-0b6cb7a17f42\") " pod="openshift-dns/node-resolver-q76dr"
Apr 16 20:11:57.173357 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.173333 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxfv4\" (UniqueName: \"kubernetes.io/projected/6789abf2-1a59-4d55-9f2d-c976b4762dab-kube-api-access-hxfv4\") pod \"aws-ebs-csi-driver-node-r2gjp\" (UID: \"6789abf2-1a59-4d55-9f2d-c976b4762dab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp"
Apr 16 20:11:57.188813 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:57.188795 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4950d98dd76e149e921ac7e36cab051b.slice/crio-0f140a336016d561059d16c453feb0bb07958c56e6306222213b03caec2a4e7c WatchSource:0}: Error finding container 0f140a336016d561059d16c453feb0bb07958c56e6306222213b03caec2a4e7c: Status 404 returned error can't find the container with id 0f140a336016d561059d16c453feb0bb07958c56e6306222213b03caec2a4e7c
Apr 16 20:11:57.246760 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.246724 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-60.ec2.internal" event={"ID":"4950d98dd76e149e921ac7e36cab051b","Type":"ContainerStarted","Data":"0f140a336016d561059d16c453feb0bb07958c56e6306222213b03caec2a4e7c"}
Apr 16 20:11:57.247594 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.247578 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal" event={"ID":"ab040f816c3a4b4dcd7ff7115ed9725d","Type":"ContainerStarted","Data":"6d4f7dc427f392099eb59b4c406dff3cc35d94d9f82b022822a42055d608e2ae"}
Apr 16 20:11:57.261703 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261686 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-node-log\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.261784 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261709 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-log-socket\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.261784 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261726 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-run-openvswitch\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.261784 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261741 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d24ecb8-b036-4a27-8ff3-1283740a16d5-ovnkube-config\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.261878 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261793 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-run-openvswitch\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.261878 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261804 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-node-log\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.261878 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261791 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-log-socket\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.261878 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261816 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f75sm\" (UniqueName: \"kubernetes.io/projected/02d874be-6206-4feb-99d1-3539318d290b-kube-api-access-f75sm\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:11:57.261878 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261871 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz\") pod \"network-check-target-tkpmf\" (UID: \"3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38\") " pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:11:57.262042 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261898 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d24ecb8-b036-4a27-8ff3-1283740a16d5-env-overrides\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262042 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261924 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/88690dc9-b75a-4009-b53f-717dd6e43bda-iptables-alerter-script\") pod \"iptables-alerter-qmblv\" (UID: \"88690dc9-b75a-4009-b53f-717dd6e43bda\") " pod="openshift-network-operator/iptables-alerter-qmblv"
Apr 16 20:11:57.262042 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261947 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88690dc9-b75a-4009-b53f-717dd6e43bda-host-slash\") pod \"iptables-alerter-qmblv\" (UID: \"88690dc9-b75a-4009-b53f-717dd6e43bda\") " pod="openshift-network-operator/iptables-alerter-qmblv"
Apr 16 20:11:57.262042 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261970 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262042 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.261995 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-cni-netd\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262042 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262019 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-slash\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262041 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262042 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88690dc9-b75a-4009-b53f-717dd6e43bda-host-slash\") pod \"iptables-alerter-qmblv\" (UID: \"88690dc9-b75a-4009-b53f-717dd6e43bda\") " pod="openshift-network-operator/iptables-alerter-qmblv"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262042 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bngnk\" (UniqueName: \"kubernetes.io/projected/9d24ecb8-b036-4a27-8ff3-1283740a16d5-kube-api-access-bngnk\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262089 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-cni-netd\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262104 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a90e7ceb-6160-4490-81d3-0bf334a5861e-konnectivity-ca\") pod \"konnectivity-agent-kzrvj\" (UID: \"a90e7ceb-6160-4490-81d3-0bf334a5861e\") " pod="kube-system/konnectivity-agent-kzrvj"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262141 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-run-netns\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262143 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-slash\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262170 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262194 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a90e7ceb-6160-4490-81d3-0bf334a5861e-agent-certs\") pod \"konnectivity-agent-kzrvj\" (UID: \"a90e7ceb-6160-4490-81d3-0bf334a5861e\") " pod="kube-system/konnectivity-agent-kzrvj"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262203 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-run-netns\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262226 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-systemd-units\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262231 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d24ecb8-b036-4a27-8ff3-1283740a16d5-ovnkube-config\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262294 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.262275 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262305 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d24ecb8-b036-4a27-8ff3-1283740a16d5-ovn-node-metrics-cert\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262353 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-etc-openvswitch\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262364 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d24ecb8-b036-4a27-8ff3-1283740a16d5-env-overrides\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8"
Apr 16 20:11:57.262919 ip-10-0-142-60
kubenswrapper[2566]: I0416 20:11:57.262394 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-etc-openvswitch\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.262402 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs podName:02d874be-6206-4feb-99d1-3539318d290b nodeName:}" failed. No retries permitted until 2026-04-16 20:11:57.762361208 +0000 UTC m=+2.202006170 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs") pod "network-metrics-daemon-jdfnl" (UID: "02d874be-6206-4feb-99d1-3539318d290b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262417 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-systemd-units\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262472 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/88690dc9-b75a-4009-b53f-717dd6e43bda-iptables-alerter-script\") pod \"iptables-alerter-qmblv\" (UID: \"88690dc9-b75a-4009-b53f-717dd6e43bda\") " pod="openshift-network-operator/iptables-alerter-qmblv" Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262470 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gfzjs\" (UniqueName: \"kubernetes.io/projected/88690dc9-b75a-4009-b53f-717dd6e43bda-kube-api-access-gfzjs\") pod \"iptables-alerter-qmblv\" (UID: \"88690dc9-b75a-4009-b53f-717dd6e43bda\") " pod="openshift-network-operator/iptables-alerter-qmblv" Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262510 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-var-lib-openvswitch\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262534 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-cni-bin\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262580 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-var-lib-openvswitch\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262584 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-cni-bin\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262634 
2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-kubelet\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262642 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-kubelet\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262652 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a90e7ceb-6160-4490-81d3-0bf334a5861e-konnectivity-ca\") pod \"konnectivity-agent-kzrvj\" (UID: \"a90e7ceb-6160-4490-81d3-0bf334a5861e\") " pod="kube-system/konnectivity-agent-kzrvj" Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262661 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9d24ecb8-b036-4a27-8ff3-1283740a16d5-ovnkube-script-lib\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.262919 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262735 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-run-ovn\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.263426 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262766 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-run-systemd\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.263426 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262787 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-run-ovn\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.263426 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262794 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.263426 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262839 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-run-systemd\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.263426 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.262858 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d24ecb8-b036-4a27-8ff3-1283740a16d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 
20:11:57.263426 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.263124 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9d24ecb8-b036-4a27-8ff3-1283740a16d5-ovnkube-script-lib\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.264358 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.264339 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d24ecb8-b036-4a27-8ff3-1283740a16d5-ovn-node-metrics-cert\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.264458 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.264445 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a90e7ceb-6160-4490-81d3-0bf334a5861e-agent-certs\") pod \"konnectivity-agent-kzrvj\" (UID: \"a90e7ceb-6160-4490-81d3-0bf334a5861e\") " pod="kube-system/konnectivity-agent-kzrvj" Apr 16 20:11:57.270362 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.270341 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:57.270362 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.270359 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:57.270489 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.270369 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jhrfz for pod openshift-network-diagnostics/network-check-target-tkpmf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:57.270489 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.270421 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz podName:3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:57.770408185 +0000 UTC m=+2.210053141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jhrfz" (UniqueName: "kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz") pod "network-check-target-tkpmf" (UID: "3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:57.272048 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.272031 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bngnk\" (UniqueName: \"kubernetes.io/projected/9d24ecb8-b036-4a27-8ff3-1283740a16d5-kube-api-access-bngnk\") pod \"ovnkube-node-8lrq8\" (UID: \"9d24ecb8-b036-4a27-8ff3-1283740a16d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.272246 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.272228 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfzjs\" (UniqueName: \"kubernetes.io/projected/88690dc9-b75a-4009-b53f-717dd6e43bda-kube-api-access-gfzjs\") pod \"iptables-alerter-qmblv\" (UID: \"88690dc9-b75a-4009-b53f-717dd6e43bda\") " pod="openshift-network-operator/iptables-alerter-qmblv" Apr 16 20:11:57.272297 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.272282 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f75sm\" (UniqueName: 
\"kubernetes.io/projected/02d874be-6206-4feb-99d1-3539318d290b-kube-api-access-f75sm\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:11:57.370818 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.370760 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp" Apr 16 20:11:57.376781 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:57.376756 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6789abf2_1a59_4d55_9f2d_c976b4762dab.slice/crio-9a74281575ee326081fabe040dda4ccc8e6ba22190b14b64a9638b0dd1d4bfe3 WatchSource:0}: Error finding container 9a74281575ee326081fabe040dda4ccc8e6ba22190b14b64a9638b0dd1d4bfe3: Status 404 returned error can't find the container with id 9a74281575ee326081fabe040dda4ccc8e6ba22190b14b64a9638b0dd1d4bfe3 Apr 16 20:11:57.384632 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.384614 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-68ppm" Apr 16 20:11:57.390613 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:57.390580 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b621d74_7f5b_47e2_afbb_a2c1610adc49.slice/crio-ca4ab9dccf46ca83232a101ce183a456a5cfe7a3e72c0f4c10809bd0e13f2e8b WatchSource:0}: Error finding container ca4ab9dccf46ca83232a101ce183a456a5cfe7a3e72c0f4c10809bd0e13f2e8b: Status 404 returned error can't find the container with id ca4ab9dccf46ca83232a101ce183a456a5cfe7a3e72c0f4c10809bd0e13f2e8b Apr 16 20:11:57.398825 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.398810 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-q76dr" Apr 16 20:11:57.398905 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.398817 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5wvp4" Apr 16 20:11:57.407242 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:57.407213 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87948db0_f0f9_46ff_ad52_0b6cb7a17f42.slice/crio-89c154c08b906bcca73c95ca2387fe0476a9b087d9ad406efb2ab6c97459ca68 WatchSource:0}: Error finding container 89c154c08b906bcca73c95ca2387fe0476a9b087d9ad406efb2ab6c97459ca68: Status 404 returned error can't find the container with id 89c154c08b906bcca73c95ca2387fe0476a9b087d9ad406efb2ab6c97459ca68 Apr 16 20:11:57.407769 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:57.407748 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe6aa55_9c5e_4ed7_bd31_4790e51c271b.slice/crio-5b1d5c42f69aa47fdfb1484aecd82b52ce7a5d24292ac0718baea2fd7076bdbd WatchSource:0}: Error finding container 5b1d5c42f69aa47fdfb1484aecd82b52ce7a5d24292ac0718baea2fd7076bdbd: Status 404 returned error can't find the container with id 5b1d5c42f69aa47fdfb1484aecd82b52ce7a5d24292ac0718baea2fd7076bdbd Apr 16 20:11:57.409253 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.409237 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4llv8" Apr 16 20:11:57.414705 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:57.414684 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6321d139_42ba_4ad4_96d3_6dafabbdc869.slice/crio-f52fa79158a721ee6ecc8438d24c3ed6b464dcce8ad63fe07692d6ff12c19687 WatchSource:0}: Error finding container f52fa79158a721ee6ecc8438d24c3ed6b464dcce8ad63fe07692d6ff12c19687: Status 404 returned error can't find the container with id f52fa79158a721ee6ecc8438d24c3ed6b464dcce8ad63fe07692d6ff12c19687 Apr 16 20:11:57.422044 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.422029 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s4bbh" Apr 16 20:11:57.427227 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:57.427205 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae5927d_11b8_46a5_a3e9_c3be8d357974.slice/crio-b1f5aead251ee43800f0eb2dc0d8ef549d585a7cd7235c0d08a361b58c2cf2ab WatchSource:0}: Error finding container b1f5aead251ee43800f0eb2dc0d8ef549d585a7cd7235c0d08a361b58c2cf2ab: Status 404 returned error can't find the container with id b1f5aead251ee43800f0eb2dc0d8ef549d585a7cd7235c0d08a361b58c2cf2ab Apr 16 20:11:57.435772 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.435756 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:57.436515 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.436489 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-kzrvj" Apr 16 20:11:57.441891 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:57.441868 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda90e7ceb_6160_4490_81d3_0bf334a5861e.slice/crio-fed6e31d47c1808bd0ea8b7731303687cb4f1b4a3e5fef297051331673c57078 WatchSource:0}: Error finding container fed6e31d47c1808bd0ea8b7731303687cb4f1b4a3e5fef297051331673c57078: Status 404 returned error can't find the container with id fed6e31d47c1808bd0ea8b7731303687cb4f1b4a3e5fef297051331673c57078 Apr 16 20:11:57.458823 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.458806 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qmblv" Apr 16 20:11:57.463867 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:57.463848 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88690dc9_b75a_4009_b53f_717dd6e43bda.slice/crio-288cdd3cb9278ab14b791b8ede8958d17fbe9b395be27455ae63e038bdade7af WatchSource:0}: Error finding container 288cdd3cb9278ab14b791b8ede8958d17fbe9b395be27455ae63e038bdade7af: Status 404 returned error can't find the container with id 288cdd3cb9278ab14b791b8ede8958d17fbe9b395be27455ae63e038bdade7af Apr 16 20:11:57.468677 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.468661 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:11:57.473902 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:11:57.473881 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d24ecb8_b036_4a27_8ff3_1283740a16d5.slice/crio-9b3d4e7776801b6d55b4e7ddfaead22c87a76319272f370e779258778cce7a46 WatchSource:0}: Error finding container 9b3d4e7776801b6d55b4e7ddfaead22c87a76319272f370e779258778cce7a46: Status 404 returned error can't find the container with id 9b3d4e7776801b6d55b4e7ddfaead22c87a76319272f370e779258778cce7a46 Apr 16 20:11:57.766306 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.765858 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:11:57.766306 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.765988 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:57.766306 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.766052 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs podName:02d874be-6206-4feb-99d1-3539318d290b nodeName:}" failed. No retries permitted until 2026-04-16 20:11:58.766032421 +0000 UTC m=+3.205677411 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs") pod "network-metrics-daemon-jdfnl" (UID: "02d874be-6206-4feb-99d1-3539318d290b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:57.867036 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.866679 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz\") pod \"network-check-target-tkpmf\" (UID: \"3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38\") " pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:11:57.867036 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.867035 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:57.867263 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.867059 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:57.867263 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.867072 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jhrfz for pod openshift-network-diagnostics/network-check-target-tkpmf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:57.867263 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:57.867131 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz podName:3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:11:58.867111553 +0000 UTC m=+3.306756512 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jhrfz" (UniqueName: "kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz") pod "network-check-target-tkpmf" (UID: "3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:57.873840 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.873819 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:57.948397 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.948362 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-kjqkq"] Apr 16 20:11:57.949149 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.949115 2566 predicate.go:212] "Predicate failed on Pod" pod="kube-system/global-pull-secret-syncer-kjqkq" err="Predicate NodeAffinity failed: node(s) didn't match Pod's node affinity/selector" Apr 16 20:11:57.949149 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.949137 2566 kubelet.go:2420] "Pod admission denied" podUID="cb0b95ee-b13b-466e-b8df-b107ca158311" pod="kube-system/global-pull-secret-syncer-kjqkq" reason="NodeAffinity" message="Predicate NodeAffinity failed: node(s) didn't match Pod's node affinity/selector" Apr 16 20:11:57.965450 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.965398 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kjqkq" podStartSLOduration=0.965382767 podStartE2EDuration="965.382767ms" podCreationTimestamp="2026-04-16 20:11:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:57.964308094 +0000 UTC 
m=+2.403953060" watchObservedRunningTime="2026-04-16 20:11:57.965382767 +0000 UTC m=+2.405027733" Apr 16 20:11:57.967186 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.967136 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cb0b95ee-b13b-466e-b8df-b107ca158311-dbus\") pod \"global-pull-secret-syncer-kjqkq\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " pod="kube-system/global-pull-secret-syncer-kjqkq" Apr 16 20:11:57.967302 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.967193 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret\") pod \"global-pull-secret-syncer-kjqkq\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " pod="kube-system/global-pull-secret-syncer-kjqkq" Apr 16 20:11:57.967302 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:57.967264 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cb0b95ee-b13b-466e-b8df-b107ca158311-kubelet-config\") pod \"global-pull-secret-syncer-kjqkq\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " pod="kube-system/global-pull-secret-syncer-kjqkq" Apr 16 20:11:58.067753 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.067669 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cb0b95ee-b13b-466e-b8df-b107ca158311-dbus\") pod \"global-pull-secret-syncer-kjqkq\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " pod="kube-system/global-pull-secret-syncer-kjqkq" Apr 16 20:11:58.067753 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.067740 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret\") pod \"global-pull-secret-syncer-kjqkq\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " pod="kube-system/global-pull-secret-syncer-kjqkq" Apr 16 20:11:58.067958 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.067819 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cb0b95ee-b13b-466e-b8df-b107ca158311-kubelet-config\") pod \"global-pull-secret-syncer-kjqkq\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " pod="kube-system/global-pull-secret-syncer-kjqkq" Apr 16 20:11:58.067958 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.067913 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cb0b95ee-b13b-466e-b8df-b107ca158311-kubelet-config\") pod \"global-pull-secret-syncer-kjqkq\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " pod="kube-system/global-pull-secret-syncer-kjqkq" Apr 16 20:11:58.068066 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.068045 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cb0b95ee-b13b-466e-b8df-b107ca158311-dbus\") pod \"global-pull-secret-syncer-kjqkq\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " pod="kube-system/global-pull-secret-syncer-kjqkq" Apr 16 20:11:58.068371 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:58.068138 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:58.068371 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:58.068198 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret podName:cb0b95ee-b13b-466e-b8df-b107ca158311 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:11:58.568180008 +0000 UTC m=+3.007824962 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret") pod "global-pull-secret-syncer-kjqkq" (UID: "cb0b95ee-b13b-466e-b8df-b107ca158311") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:58.098716 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.098675 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:57 +0000 UTC" deadline="2027-12-17 20:19:20.040761941 +0000 UTC" Apr 16 20:11:58.098716 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.098719 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14640h7m21.942046732s" Apr 16 20:11:58.270865 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.270836 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cb0b95ee-b13b-466e-b8df-b107ca158311-kubelet-config\") pod \"cb0b95ee-b13b-466e-b8df-b107ca158311\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " Apr 16 20:11:58.271045 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.270880 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cb0b95ee-b13b-466e-b8df-b107ca158311-dbus\") pod \"cb0b95ee-b13b-466e-b8df-b107ca158311\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " Apr 16 20:11:58.271111 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.271083 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb0b95ee-b13b-466e-b8df-b107ca158311-dbus" (OuterVolumeSpecName: "dbus") pod "cb0b95ee-b13b-466e-b8df-b107ca158311" (UID: "cb0b95ee-b13b-466e-b8df-b107ca158311"). 
InnerVolumeSpecName "dbus". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 16 20:11:58.271203 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.271185 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb0b95ee-b13b-466e-b8df-b107ca158311-kubelet-config" (OuterVolumeSpecName: "kubelet-config") pod "cb0b95ee-b13b-466e-b8df-b107ca158311" (UID: "cb0b95ee-b13b-466e-b8df-b107ca158311"). InnerVolumeSpecName "kubelet-config". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Apr 16 20:11:58.272234 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.272178 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" event={"ID":"9d24ecb8-b036-4a27-8ff3-1283740a16d5","Type":"ContainerStarted","Data":"9b3d4e7776801b6d55b4e7ddfaead22c87a76319272f370e779258778cce7a46"} Apr 16 20:11:58.289424 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.289391 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qmblv" event={"ID":"88690dc9-b75a-4009-b53f-717dd6e43bda","Type":"ContainerStarted","Data":"288cdd3cb9278ab14b791b8ede8958d17fbe9b395be27455ae63e038bdade7af"} Apr 16 20:11:58.299529 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.299480 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4llv8" event={"ID":"6321d139-42ba-4ad4-96d3-6dafabbdc869","Type":"ContainerStarted","Data":"f52fa79158a721ee6ecc8438d24c3ed6b464dcce8ad63fe07692d6ff12c19687"} Apr 16 20:11:58.309800 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.309762 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q76dr" event={"ID":"87948db0-f0f9-46ff-ad52-0b6cb7a17f42","Type":"ContainerStarted","Data":"89c154c08b906bcca73c95ca2387fe0476a9b087d9ad406efb2ab6c97459ca68"} Apr 16 20:11:58.327727 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.327667 2566 
reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:58.335110 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.335046 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kzrvj" event={"ID":"a90e7ceb-6160-4490-81d3-0bf334a5861e","Type":"ContainerStarted","Data":"fed6e31d47c1808bd0ea8b7731303687cb4f1b4a3e5fef297051331673c57078"} Apr 16 20:11:58.355665 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.355638 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s4bbh" event={"ID":"aae5927d-11b8-46a5-a3e9-c3be8d357974","Type":"ContainerStarted","Data":"b1f5aead251ee43800f0eb2dc0d8ef549d585a7cd7235c0d08a361b58c2cf2ab"} Apr 16 20:11:58.372050 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.371997 2566 reconciler_common.go:299] "Volume detached for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cb0b95ee-b13b-466e-b8df-b107ca158311-kubelet-config\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\"" Apr 16 20:11:58.372050 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.372022 2566 reconciler_common.go:299] "Volume detached for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cb0b95ee-b13b-466e-b8df-b107ca158311-dbus\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\"" Apr 16 20:11:58.373100 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.372990 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5wvp4" event={"ID":"3fe6aa55-9c5e-4ed7-bd31-4790e51c271b","Type":"ContainerStarted","Data":"5b1d5c42f69aa47fdfb1484aecd82b52ce7a5d24292ac0718baea2fd7076bdbd"} Apr 16 20:11:58.406072 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.405981 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-68ppm" 
event={"ID":"3b621d74-7f5b-47e2-afbb-a2c1610adc49","Type":"ContainerStarted","Data":"ca4ab9dccf46ca83232a101ce183a456a5cfe7a3e72c0f4c10809bd0e13f2e8b"} Apr 16 20:11:58.408346 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.408321 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp" event={"ID":"6789abf2-1a59-4d55-9f2d-c976b4762dab","Type":"ContainerStarted","Data":"9a74281575ee326081fabe040dda4ccc8e6ba22190b14b64a9638b0dd1d4bfe3"} Apr 16 20:11:58.573844 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.573809 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret\") pod \"global-pull-secret-syncer-kjqkq\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " pod="kube-system/global-pull-secret-syncer-kjqkq" Apr 16 20:11:58.574016 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:58.573986 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:58.574076 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:58.574047 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret podName:cb0b95ee-b13b-466e-b8df-b107ca158311 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:59.574029501 +0000 UTC m=+4.013674451 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret") pod "global-pull-secret-syncer-kjqkq" (UID: "cb0b95ee-b13b-466e-b8df-b107ca158311") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:58.776083 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.776004 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:11:58.776236 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:58.776200 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:58.776288 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:58.776260 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs podName:02d874be-6206-4feb-99d1-3539318d290b nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.776241364 +0000 UTC m=+5.215886311 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs") pod "network-metrics-daemon-jdfnl" (UID: "02d874be-6206-4feb-99d1-3539318d290b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:58.876860 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.876822 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz\") pod \"network-check-target-tkpmf\" (UID: \"3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38\") " pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:11:58.877052 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:58.877035 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:58.877171 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:58.877067 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:58.877171 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:58.877081 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jhrfz for pod openshift-network-diagnostics/network-check-target-tkpmf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:58.877269 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:58.877180 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz podName:3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:12:00.877159143 +0000 UTC m=+5.316804101 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jhrfz" (UniqueName: "kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz") pod "network-check-target-tkpmf" (UID: "3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:58.994887 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:58.994858 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:59.098940 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:59.098851 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:57 +0000 UTC" deadline="2027-12-21 00:07:30.171915781 +0000 UTC" Apr 16 20:11:59.098940 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:59.098889 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14715h55m31.073030242s" Apr 16 20:11:59.244491 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:59.244451 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:11:59.244717 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:59.244672 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38" Apr 16 20:11:59.244807 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:59.244745 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:11:59.244859 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:59.244834 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b" Apr 16 20:11:59.583436 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:11:59.583354 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret\") pod \"global-pull-secret-syncer-kjqkq\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " pod="kube-system/global-pull-secret-syncer-kjqkq" Apr 16 20:11:59.583655 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:59.583516 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:59.583655 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:11:59.583576 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret podName:cb0b95ee-b13b-466e-b8df-b107ca158311 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:01.583559092 +0000 UTC m=+6.023204039 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret") pod "global-pull-secret-syncer-kjqkq" (UID: "cb0b95ee-b13b-466e-b8df-b107ca158311") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:00.793167 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:00.792950 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:00.793668 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:00.793219 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:00.793668 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:00.793294 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs podName:02d874be-6206-4feb-99d1-3539318d290b nodeName:}" failed. No retries permitted until 2026-04-16 20:12:04.793274059 +0000 UTC m=+9.232919022 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs") pod "network-metrics-daemon-jdfnl" (UID: "02d874be-6206-4feb-99d1-3539318d290b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:00.893748 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:00.893654 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz\") pod \"network-check-target-tkpmf\" (UID: \"3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38\") " pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:00.893924 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:00.893809 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:12:00.893924 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:00.893828 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:12:00.893924 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:00.893841 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jhrfz for pod openshift-network-diagnostics/network-check-target-tkpmf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:00.893924 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:00.893902 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz podName:3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:12:04.893883685 +0000 UTC m=+9.333528631 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jhrfz" (UniqueName: "kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz") pod "network-check-target-tkpmf" (UID: "3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:01.244528 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:01.244448 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:01.244712 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:01.244448 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:01.244712 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:01.244577 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38" Apr 16 20:12:01.244712 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:01.244679 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b" Apr 16 20:12:01.600347 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:01.600265 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret\") pod \"global-pull-secret-syncer-kjqkq\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " pod="kube-system/global-pull-secret-syncer-kjqkq" Apr 16 20:12:01.600502 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:01.600399 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:01.600502 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:01.600460 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret podName:cb0b95ee-b13b-466e-b8df-b107ca158311 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:05.600442658 +0000 UTC m=+10.040087604 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret") pod "global-pull-secret-syncer-kjqkq" (UID: "cb0b95ee-b13b-466e-b8df-b107ca158311") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:03.243753 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:03.243717 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:03.244217 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:03.243830 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38" Apr 16 20:12:03.244217 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:03.244096 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:03.244330 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:03.244218 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b" Apr 16 20:12:04.827934 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:04.827869 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:04.828497 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:04.828009 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:04.828497 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:04.828086 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs podName:02d874be-6206-4feb-99d1-3539318d290b nodeName:}" failed. No retries permitted until 2026-04-16 20:12:12.828063056 +0000 UTC m=+17.267708020 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs") pod "network-metrics-daemon-jdfnl" (UID: "02d874be-6206-4feb-99d1-3539318d290b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:04.928583 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:04.928326 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz\") pod \"network-check-target-tkpmf\" (UID: \"3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38\") " pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:04.928742 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:04.928474 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:12:04.928742 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:04.928670 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:12:04.928742 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:04.928685 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jhrfz for pod openshift-network-diagnostics/network-check-target-tkpmf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:04.928742 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:04.928739 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz podName:3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:12:12.928720402 +0000 UTC m=+17.368365357 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jhrfz" (UniqueName: "kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz") pod "network-check-target-tkpmf" (UID: "3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:05.244051 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:05.243970 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:05.244051 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:05.244000 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:05.244266 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:05.244095 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38" Apr 16 20:12:05.244266 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:05.244248 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b" Apr 16 20:12:05.633866 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:05.633780 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret\") pod \"global-pull-secret-syncer-kjqkq\" (UID: \"cb0b95ee-b13b-466e-b8df-b107ca158311\") " pod="kube-system/global-pull-secret-syncer-kjqkq" Apr 16 20:12:05.634019 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:05.633943 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:05.634083 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:05.634017 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret podName:cb0b95ee-b13b-466e-b8df-b107ca158311 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:13.633995878 +0000 UTC m=+18.073640826 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret") pod "global-pull-secret-syncer-kjqkq" (UID: "cb0b95ee-b13b-466e-b8df-b107ca158311") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:06.279437 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.279397 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kube-system/global-pull-secret-syncer-kjqkq"] Apr 16 20:12:06.279887 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.279501 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjqkq"
Apr 16 20:12:06.283532 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.283505 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kube-system/global-pull-secret-syncer-kjqkq"]
Apr 16 20:12:06.286304 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.286277 2566 status_manager.go:895] "Failed to get status for pod" podUID="cb0b95ee-b13b-466e-b8df-b107ca158311" pod="kube-system/global-pull-secret-syncer-kjqkq" err="pods \"global-pull-secret-syncer-kjqkq\" is forbidden: User \"system:node:ip-10-0-142-60.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-142-60.ec2.internal' and this object"
Apr 16 20:12:06.310947 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.309962 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-kpb78"]
Apr 16 20:12:06.315194 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.315117 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:06.315322 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:06.315190 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kpb78" podUID="8a902087-f546-42a1-b9a5-96dab151ae99"
Apr 16 20:12:06.331211 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.331174 2566 status_manager.go:895] "Failed to get status for pod" podUID="cb0b95ee-b13b-466e-b8df-b107ca158311" pod="kube-system/global-pull-secret-syncer-kjqkq" err="pods \"global-pull-secret-syncer-kjqkq\" is forbidden: User \"system:node:ip-10-0-142-60.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-142-60.ec2.internal' and this object"
Apr 16 20:12:06.338532 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.338510 2566 reconciler_common.go:299] "Volume detached for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cb0b95ee-b13b-466e-b8df-b107ca158311-original-pull-secret\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\""
Apr 16 20:12:06.439795 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.439755 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:06.439950 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.439859 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8a902087-f546-42a1-b9a5-96dab151ae99-kubelet-config\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:06.439950 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.439906 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8a902087-f546-42a1-b9a5-96dab151ae99-dbus\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:06.541021 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.540867 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8a902087-f546-42a1-b9a5-96dab151ae99-kubelet-config\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:06.541021 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.540916 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8a902087-f546-42a1-b9a5-96dab151ae99-dbus\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:06.541021 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.540973 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:06.541021 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.540985 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8a902087-f546-42a1-b9a5-96dab151ae99-kubelet-config\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:06.541334 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:06.541065 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8a902087-f546-42a1-b9a5-96dab151ae99-dbus\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:06.541334 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:06.541104 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:06.541334 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:06.541165 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret podName:8a902087-f546-42a1-b9a5-96dab151ae99 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:07.041146848 +0000 UTC m=+11.480791794 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret") pod "global-pull-secret-syncer-kpb78" (UID: "8a902087-f546-42a1-b9a5-96dab151ae99") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:07.045537 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:07.045505 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:07.045718 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:07.045652 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:07.045802 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:07.045718 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret podName:8a902087-f546-42a1-b9a5-96dab151ae99 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:08.045700242 +0000 UTC m=+12.485345216 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret") pod "global-pull-secret-syncer-kpb78" (UID: "8a902087-f546-42a1-b9a5-96dab151ae99") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:07.244272 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:07.244248 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:12:07.244363 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:07.244261 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:12:07.244363 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:07.244353 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38"
Apr 16 20:12:07.244480 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:07.244455 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b"
Apr 16 20:12:08.053876 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:08.053842 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:08.054342 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:08.053975 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:08.054342 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:08.054035 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret podName:8a902087-f546-42a1-b9a5-96dab151ae99 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:10.054020724 +0000 UTC m=+14.493665675 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret") pod "global-pull-secret-syncer-kpb78" (UID: "8a902087-f546-42a1-b9a5-96dab151ae99") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:08.244318 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:08.244282 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:08.244497 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:08.244397 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kpb78" podUID="8a902087-f546-42a1-b9a5-96dab151ae99"
Apr 16 20:12:09.244093 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:09.244063 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:12:09.244093 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:09.244085 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:12:09.244443 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:09.244182 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b"
Apr 16 20:12:09.244443 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:09.244296 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38"
Apr 16 20:12:10.071208 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:10.071165 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:10.071384 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:10.071323 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:10.071458 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:10.071400 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret podName:8a902087-f546-42a1-b9a5-96dab151ae99 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:14.071378404 +0000 UTC m=+18.511023354 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret") pod "global-pull-secret-syncer-kpb78" (UID: "8a902087-f546-42a1-b9a5-96dab151ae99") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:10.244532 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:10.244500 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:10.244977 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:10.244621 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kpb78" podUID="8a902087-f546-42a1-b9a5-96dab151ae99"
Apr 16 20:12:11.244433 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:11.244397 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:12:11.244642 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:11.244519 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38"
Apr 16 20:12:11.244642 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:11.244586 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:12:11.245037 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:11.244721 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b"
Apr 16 20:12:12.246270 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:12.246244 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:12.246696 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:12.246357 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kpb78" podUID="8a902087-f546-42a1-b9a5-96dab151ae99"
Apr 16 20:12:12.892529 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:12.892493 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:12:12.892753 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:12.892672 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:12.892753 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:12.892742 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs podName:02d874be-6206-4feb-99d1-3539318d290b nodeName:}" failed. No retries permitted until 2026-04-16 20:12:28.892725855 +0000 UTC m=+33.332370801 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs") pod "network-metrics-daemon-jdfnl" (UID: "02d874be-6206-4feb-99d1-3539318d290b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:12:12.993784 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:12.993753 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz\") pod \"network-check-target-tkpmf\" (UID: \"3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38\") " pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:12:12.993946 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:12.993928 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:12:12.994004 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:12.993951 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:12:12.994004 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:12.993960 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jhrfz for pod openshift-network-diagnostics/network-check-target-tkpmf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:12.994075 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:12.994018 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz podName:3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:28.993999618 +0000 UTC m=+33.433644576 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jhrfz" (UniqueName: "kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz") pod "network-check-target-tkpmf" (UID: "3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:12:13.244441 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:13.244369 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:12:13.244441 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:13.244384 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:12:13.244656 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:13.244491 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38"
Apr 16 20:12:13.244711 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:13.244664 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b"
Apr 16 20:12:14.104008 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:14.103967 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:14.104491 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:14.104095 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:14.104491 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:14.104167 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret podName:8a902087-f546-42a1-b9a5-96dab151ae99 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:22.104148018 +0000 UTC m=+26.543793003 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret") pod "global-pull-secret-syncer-kpb78" (UID: "8a902087-f546-42a1-b9a5-96dab151ae99") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:14.244358 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:14.244320 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:14.244537 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:14.244453 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kpb78" podUID="8a902087-f546-42a1-b9a5-96dab151ae99"
Apr 16 20:12:15.244306 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:15.244278 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:12:15.244683 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:15.244283 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:12:15.244683 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:15.244370 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38"
Apr 16 20:12:15.244683 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:15.244451 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b"
Apr 16 20:12:16.244932 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.244575 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:16.245728 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:16.245064 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kpb78" podUID="8a902087-f546-42a1-b9a5-96dab151ae99"
Apr 16 20:12:16.456455 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.456423 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-60.ec2.internal" event={"ID":"4950d98dd76e149e921ac7e36cab051b","Type":"ContainerStarted","Data":"7b2d723415a9a19da4dde659969b6fbf8aab92acc4bcdf8cd90331b1a66653d8"}
Apr 16 20:12:16.459233 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.458737 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s4bbh" event={"ID":"aae5927d-11b8-46a5-a3e9-c3be8d357974","Type":"ContainerStarted","Data":"4c283795cdc0aaa38234347b7f28e19df287d21980d4d9784ae4751ef7e902f3"}
Apr 16 20:12:16.462922 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.462851 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-68ppm" event={"ID":"3b621d74-7f5b-47e2-afbb-a2c1610adc49","Type":"ContainerStarted","Data":"d751bf0b2394b14b2c466205d18e6605a8621b9466d2102d7adc898f0cc96fa5"}
Apr 16 20:12:16.471503 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.470653 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" event={"ID":"9d24ecb8-b036-4a27-8ff3-1283740a16d5","Type":"ContainerStarted","Data":"e5a2b2201e4cc98c11c32ee1485a7fcf570444df5e55cadbdbc4de2767dbe75b"}
Apr 16 20:12:16.471503 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.470684 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" event={"ID":"9d24ecb8-b036-4a27-8ff3-1283740a16d5","Type":"ContainerStarted","Data":"0f3608415e4a70e990b1c7bba240e69ae25c9b942260e02d08a79cf371ae0ef0"}
Apr 16 20:12:16.471503 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.470698 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" event={"ID":"9d24ecb8-b036-4a27-8ff3-1283740a16d5","Type":"ContainerStarted","Data":"b856410cc4376fb851c1e66e6994d67e1122f76667fe72643263eb16ce4c56bf"}
Apr 16 20:12:16.471503 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.470712 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" event={"ID":"9d24ecb8-b036-4a27-8ff3-1283740a16d5","Type":"ContainerStarted","Data":"fbde3677347c6082ddea16582e3477160fb36cb80da431001e3273a13e1ba1fb"}
Apr 16 20:12:16.471503 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.470725 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" event={"ID":"9d24ecb8-b036-4a27-8ff3-1283740a16d5","Type":"ContainerStarted","Data":"8c6ba9b769fa8ca152a94d72af5b0380a706c4bf4c2dc9070e99523ae5a59d90"}
Apr 16 20:12:16.471503 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.470739 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" event={"ID":"9d24ecb8-b036-4a27-8ff3-1283740a16d5","Type":"ContainerStarted","Data":"aaf315f4961032aaff4d8fd7c11eb8041c8de677e56e96cc8fba4057c35d31a1"}
Apr 16 20:12:16.471503 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.471477 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-60.ec2.internal" podStartSLOduration=20.471461272 podStartE2EDuration="20.471461272s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:16.470755379 +0000 UTC m=+20.910400344" watchObservedRunningTime="2026-04-16 20:12:16.471461272 +0000 UTC m=+20.911106236"
Apr 16 20:12:16.488176 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.488136 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-68ppm" podStartSLOduration=2.468460138 podStartE2EDuration="20.488124326s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="2026-04-16 20:11:57.392271046 +0000 UTC m=+1.831915988" lastFinishedPulling="2026-04-16 20:12:15.41193522 +0000 UTC m=+19.851580176" observedRunningTime="2026-04-16 20:12:16.487837296 +0000 UTC m=+20.927482270" watchObservedRunningTime="2026-04-16 20:12:16.488124326 +0000 UTC m=+20.927769290"
Apr 16 20:12:16.507963 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:16.506663 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s4bbh" podStartSLOduration=2.494962546 podStartE2EDuration="20.50664864s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="2026-04-16 20:11:57.428548629 +0000 UTC m=+1.868193571" lastFinishedPulling="2026-04-16 20:12:15.440234719 +0000 UTC m=+19.879879665" observedRunningTime="2026-04-16 20:12:16.506647897 +0000 UTC m=+20.946292853" watchObservedRunningTime="2026-04-16 20:12:16.50664864 +0000 UTC m=+20.946293602"
Apr 16 20:12:17.244303 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.244097 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:12:17.244426 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.244096 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:12:17.244426 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:17.244351 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b"
Apr 16 20:12:17.244426 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:17.244401 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38"
Apr 16 20:12:17.387690 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.387668 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 20:12:17.473452 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.473416 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qmblv" event={"ID":"88690dc9-b75a-4009-b53f-717dd6e43bda","Type":"ContainerStarted","Data":"86f286fbadd76b9c6b23be5ca36214e03bea34679eea54b19e4277b6d33b85bd"}
Apr 16 20:12:17.477436 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.477409 2566 generic.go:358] "Generic (PLEG): container finished" podID="6321d139-42ba-4ad4-96d3-6dafabbdc869" containerID="c7f1023a2cbf8a7149a790d24e4f231401e684eb011f22acca1a549f706381e1" exitCode=0
Apr 16 20:12:17.477551 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.477476 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4llv8" event={"ID":"6321d139-42ba-4ad4-96d3-6dafabbdc869","Type":"ContainerDied","Data":"c7f1023a2cbf8a7149a790d24e4f231401e684eb011f22acca1a549f706381e1"}
Apr 16 20:12:17.478836 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.478778 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q76dr" event={"ID":"87948db0-f0f9-46ff-ad52-0b6cb7a17f42","Type":"ContainerStarted","Data":"c8ac13716d2a151aef46e20644a8f0bc89d1376d465f803194377c43ff9188c6"}
Apr 16 20:12:17.480081 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.479977 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kzrvj" event={"ID":"a90e7ceb-6160-4490-81d3-0bf334a5861e","Type":"ContainerStarted","Data":"09d96203c487a53769dbd9ca7f6cfb9b953141a5b05fe5d28840b06495cd7054"}
Apr 16 20:12:17.481202 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.481184 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5wvp4" event={"ID":"3fe6aa55-9c5e-4ed7-bd31-4790e51c271b","Type":"ContainerStarted","Data":"a6807dd93117d0a44f9d242691041168d01cea3697c521847426678101468ea4"}
Apr 16 20:12:17.482642 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.482622 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp" event={"ID":"6789abf2-1a59-4d55-9f2d-c976b4762dab","Type":"ContainerStarted","Data":"d29ad8eb013bed24fd527c305a251b935db14a833ef0b9ced02c4743720a67b9"}
Apr 16 20:12:17.482727 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.482647 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp" event={"ID":"6789abf2-1a59-4d55-9f2d-c976b4762dab","Type":"ContainerStarted","Data":"0e3ebc8dcf56fd190b0a2a88e5fbf9e8efe1ba425d8da724c62679b3ba7554d8"}
Apr 16 20:12:17.483883 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.483863 2566 generic.go:358] "Generic (PLEG): container finished" podID="ab040f816c3a4b4dcd7ff7115ed9725d" containerID="52119b674b5aee11b68e1e2be66a9a734d55bf892caba37dae364ae548615ba6" exitCode=0
Apr 16 20:12:17.484006 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.483986 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal" event={"ID":"ab040f816c3a4b4dcd7ff7115ed9725d","Type":"ContainerDied","Data":"52119b674b5aee11b68e1e2be66a9a734d55bf892caba37dae364ae548615ba6"}
Apr 16 20:12:17.508385 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.508341 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qmblv" podStartSLOduration=3.563600186 podStartE2EDuration="21.508326094s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="2026-04-16 20:11:57.465243599 +0000 UTC m=+1.904888555" lastFinishedPulling="2026-04-16 20:12:15.409969505 +0000 UTC m=+19.849614463" observedRunningTime="2026-04-16 20:12:17.493648317 +0000 UTC m=+21.933293282" watchObservedRunningTime="2026-04-16 20:12:17.508326094 +0000 UTC m=+21.947971059"
Apr 16 20:12:17.544825 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.544781 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-q76dr" podStartSLOduration=3.543987907 podStartE2EDuration="21.544767183s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="2026-04-16 20:11:57.408920845 +0000 UTC m=+1.848565787" lastFinishedPulling="2026-04-16 20:12:15.409700118 +0000 UTC m=+19.849345063" observedRunningTime="2026-04-16 20:12:17.544745648 +0000 UTC m=+21.984390612" watchObservedRunningTime="2026-04-16 20:12:17.544767183 +0000 UTC m=+21.984412147"
Apr 16 20:12:17.574672 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.574626 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-kzrvj" podStartSLOduration=3.6328719 podStartE2EDuration="21.574593466s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="2026-04-16 20:11:57.443230803 +0000 UTC m=+1.882875744" lastFinishedPulling="2026-04-16 20:12:15.384952369 +0000 UTC m=+19.824597310" observedRunningTime="2026-04-16 20:12:17.574374646 +0000 UTC m=+22.014019612" watchObservedRunningTime="2026-04-16 20:12:17.574593466 +0000 UTC m=+22.014238431"
Apr 16 20:12:17.575070 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:17.575046 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5wvp4" podStartSLOduration=3.574722979 podStartE2EDuration="21.575039644s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="2026-04-16 20:11:57.409403102 +0000 UTC m=+1.849048058" lastFinishedPulling="2026-04-16 20:12:15.409719775 +0000 UTC m=+19.849364723" observedRunningTime="2026-04-16 20:12:17.560040073 +0000 UTC m=+21.999685041" watchObservedRunningTime="2026-04-16 20:12:17.575039644 +0000 UTC m=+22.014684608"
Apr 16 20:12:18.130122 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:18.130015 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:12:17.387686581Z","UUID":"37dd55cc-edd0-4fd4-b9d6-846e8270017a","Handler":null,"Name":"","Endpoint":""}
Apr 16 20:12:18.133150 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:18.133123 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 20:12:18.133150 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:18.133152 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 20:12:18.243893 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:18.243864 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78"
Apr 16 20:12:18.244069 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:18.243993 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kpb78" podUID="8a902087-f546-42a1-b9a5-96dab151ae99"
Apr 16 20:12:18.487686 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:18.487657 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp" event={"ID":"6789abf2-1a59-4d55-9f2d-c976b4762dab","Type":"ContainerStarted","Data":"95f8db3c58afa446b883c910fc0911371fa00f2ec7e99dd85d87b97c38e4a189"}
Apr 16 20:12:18.489623 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:18.489582 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal" event={"ID":"ab040f816c3a4b4dcd7ff7115ed9725d","Type":"ContainerStarted","Data":"d18655b3cfe3ba2e8f0112c261c07d414727c4bc00ec25fa2f3aa1af7bf188f7"}
Apr 16 20:12:18.523754 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:18.523714 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r2gjp" podStartSLOduration=1.56174827 podStartE2EDuration="22.523701982s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="2026-04-16 20:11:57.37856178 +0000 UTC m=+1.818206722" lastFinishedPulling="2026-04-16 20:12:18.340515492 +0000 UTC m=+22.780160434" observedRunningTime="2026-04-16
20:12:18.508374608 +0000 UTC m=+22.948019572" watchObservedRunningTime="2026-04-16 20:12:18.523701982 +0000 UTC m=+22.963346946" Apr 16 20:12:18.524131 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:18.524107 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-60.ec2.internal" podStartSLOduration=22.524099958 podStartE2EDuration="22.524099958s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:18.523826724 +0000 UTC m=+22.963471667" watchObservedRunningTime="2026-04-16 20:12:18.524099958 +0000 UTC m=+22.963744922" Apr 16 20:12:19.243859 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:19.243829 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:19.243859 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:19.243830 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:19.244099 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:19.243961 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38" Apr 16 20:12:19.244145 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:19.244101 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b" Apr 16 20:12:19.495176 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:19.495053 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" event={"ID":"9d24ecb8-b036-4a27-8ff3-1283740a16d5","Type":"ContainerStarted","Data":"1ee716360c54bf12ffb5e5c60002e0a4cbd7838579eed1d6b834b90acbc33a17"} Apr 16 20:12:20.244503 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:20.244473 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78" Apr 16 20:12:20.244737 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:20.244594 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kpb78" podUID="8a902087-f546-42a1-b9a5-96dab151ae99" Apr 16 20:12:21.243854 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:21.243815 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:21.244363 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:21.243823 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:21.244363 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:21.243925 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38" Apr 16 20:12:21.244363 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:21.244037 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b" Apr 16 20:12:21.862506 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:21.862309 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-kzrvj" Apr 16 20:12:21.863045 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:21.863011 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-kzrvj" Apr 16 20:12:22.164507 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:22.164418 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78" Apr 16 20:12:22.164668 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:22.164552 2566 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:22.164668 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:22.164628 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret podName:8a902087-f546-42a1-b9a5-96dab151ae99 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:38.164594997 +0000 UTC m=+42.604239952 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret") pod "global-pull-secret-syncer-kpb78" (UID: "8a902087-f546-42a1-b9a5-96dab151ae99") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:22.244123 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:22.244097 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78" Apr 16 20:12:22.244743 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:22.244203 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kpb78" podUID="8a902087-f546-42a1-b9a5-96dab151ae99" Apr 16 20:12:22.506381 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:22.506345 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" event={"ID":"9d24ecb8-b036-4a27-8ff3-1283740a16d5","Type":"ContainerStarted","Data":"9811b7808f6fc5dd133bc39ce63102bafb5a7d01e6f36d39623e4cb848a80baa"} Apr 16 20:12:22.506695 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:22.506663 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:12:22.506823 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:22.506708 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:12:22.507943 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:22.507921 2566 generic.go:358] "Generic (PLEG): container finished" podID="6321d139-42ba-4ad4-96d3-6dafabbdc869" containerID="fdaa7d7b49b97da6fbcfba7857a06fb9decac473206ed8e5ef77fed074359f02" exitCode=0 Apr 16 20:12:22.508051 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:22.507950 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4llv8" event={"ID":"6321d139-42ba-4ad4-96d3-6dafabbdc869","Type":"ContainerDied","Data":"fdaa7d7b49b97da6fbcfba7857a06fb9decac473206ed8e5ef77fed074359f02"} Apr 16 20:12:22.508281 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:22.508266 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-kzrvj" Apr 16 20:12:22.508640 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:22.508621 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-kzrvj" Apr 16 20:12:22.521781 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:22.521763 2566 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:12:22.531133 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:22.531100 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" podStartSLOduration=8.051933511 podStartE2EDuration="26.531090223s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="2026-04-16 20:11:57.475233248 +0000 UTC m=+1.914878190" lastFinishedPulling="2026-04-16 20:12:15.954389958 +0000 UTC m=+20.394034902" observedRunningTime="2026-04-16 20:12:22.529731989 +0000 UTC m=+26.969376953" watchObservedRunningTime="2026-04-16 20:12:22.531090223 +0000 UTC m=+26.970735212" Apr 16 20:12:23.243964 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:23.243940 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:23.244132 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:23.243945 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:23.244132 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:23.244039 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38" Apr 16 20:12:23.244485 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:23.244157 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b" Apr 16 20:12:23.437735 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:23.437674 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tkpmf"] Apr 16 20:12:23.440903 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:23.440872 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kpb78"] Apr 16 20:12:23.441030 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:23.440990 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78" Apr 16 20:12:23.441107 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:23.441086 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kpb78" podUID="8a902087-f546-42a1-b9a5-96dab151ae99" Apr 16 20:12:23.441495 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:23.441470 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jdfnl"] Apr 16 20:12:23.511892 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:23.511868 2566 generic.go:358] "Generic (PLEG): container finished" podID="6321d139-42ba-4ad4-96d3-6dafabbdc869" containerID="75d7b353b80707088d09705b181dea5f50bf9e127ca25c82bce6bfdc05a5b05e" exitCode=0 Apr 16 20:12:23.512013 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:23.511934 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:23.512013 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:23.511975 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4llv8" event={"ID":"6321d139-42ba-4ad4-96d3-6dafabbdc869","Type":"ContainerDied","Data":"75d7b353b80707088d09705b181dea5f50bf9e127ca25c82bce6bfdc05a5b05e"} Apr 16 20:12:23.512229 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:23.512196 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38" Apr 16 20:12:23.512765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:23.512712 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:23.512832 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:23.512811 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b" Apr 16 20:12:23.512882 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:23.512831 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:12:23.526985 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:23.526966 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:12:24.515288 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:24.515053 2566 generic.go:358] "Generic (PLEG): container finished" podID="6321d139-42ba-4ad4-96d3-6dafabbdc869" containerID="dd154ce8f03e9d60e43fe0310fbaa527382342deb55cef0ee91906014509cee6" exitCode=0 Apr 16 20:12:24.515659 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:24.515129 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4llv8" event={"ID":"6321d139-42ba-4ad4-96d3-6dafabbdc869","Type":"ContainerDied","Data":"dd154ce8f03e9d60e43fe0310fbaa527382342deb55cef0ee91906014509cee6"} Apr 16 20:12:25.244509 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:25.244482 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78" Apr 16 20:12:25.244686 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:25.244537 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:25.244686 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:25.244563 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:25.244686 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:25.244664 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38" Apr 16 20:12:25.245005 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:25.244967 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kpb78" podUID="8a902087-f546-42a1-b9a5-96dab151ae99" Apr 16 20:12:25.245123 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:25.245073 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b" Apr 16 20:12:27.244059 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:27.244033 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:27.244717 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:27.244033 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78" Apr 16 20:12:27.244717 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:27.244150 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b" Apr 16 20:12:27.244717 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:27.244172 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:27.244717 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:27.244255 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kpb78" podUID="8a902087-f546-42a1-b9a5-96dab151ae99" Apr 16 20:12:27.244717 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:27.244325 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tkpmf" podUID="3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38" Apr 16 20:12:28.862785 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.862758 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-60.ec2.internal" event="NodeReady" Apr 16 20:12:28.863380 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.862894 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 20:12:28.906152 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.906122 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-k5t5f"] Apr 16 20:12:28.910820 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.910794 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:28.910958 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:28.910934 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:28.911018 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:28.910984 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs podName:02d874be-6206-4feb-99d1-3539318d290b nodeName:}" failed. No retries permitted until 2026-04-16 20:13:00.910971668 +0000 UTC m=+65.350616614 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs") pod "network-metrics-daemon-jdfnl" (UID: "02d874be-6206-4feb-99d1-3539318d290b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:28.942547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.942521 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-t7fts"] Apr 16 20:12:28.942704 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.942678 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:28.945057 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.944886 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dstbq\"" Apr 16 20:12:28.945057 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.944890 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 20:12:28.945057 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.944933 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 20:12:28.958292 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.958274 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k5t5f"] Apr 16 20:12:28.958292 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.958294 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t7fts"] Apr 16 20:12:28.958454 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.958415 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t7fts" Apr 16 20:12:28.961880 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.961355 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 20:12:28.961880 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.961524 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jqqmg\"" Apr 16 20:12:28.961880 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.961846 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 20:12:28.964385 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:28.963143 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 20:12:29.011689 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.011662 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:29.011827 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.011726 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz\") pod \"network-check-target-tkpmf\" (UID: \"3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38\") " pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:29.011827 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.011795 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/96e95540-055f-454d-b85c-31093fbd7bbf-tmp-dir\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:29.011938 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.011833 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96e95540-055f-454d-b85c-31093fbd7bbf-config-volume\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:29.011938 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:29.011838 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:12:29.011938 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:29.011862 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:12:29.011938 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:29.011874 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jhrfz for pod openshift-network-diagnostics/network-check-target-tkpmf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:29.011938 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.011901 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vv67\" (UniqueName: \"kubernetes.io/projected/96e95540-055f-454d-b85c-31093fbd7bbf-kube-api-access-5vv67\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:29.011938 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:29.011927 
2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz podName:3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38 nodeName:}" failed. No retries permitted until 2026-04-16 20:13:01.011909125 +0000 UTC m=+65.451554081 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jhrfz" (UniqueName: "kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz") pod "network-check-target-tkpmf" (UID: "3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:29.112487 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.112459 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts" Apr 16 20:12:29.112487 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.112494 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vv67\" (UniqueName: \"kubernetes.io/projected/96e95540-055f-454d-b85c-31093fbd7bbf-kube-api-access-5vv67\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:29.112716 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.112521 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:29.112716 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:29.112645 2566 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:29.112716 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.112654 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96e95540-055f-454d-b85c-31093fbd7bbf-tmp-dir\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:29.112716 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.112697 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dhc6\" (UniqueName: \"kubernetes.io/projected/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-kube-api-access-2dhc6\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts" Apr 16 20:12:29.112716 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:29.112712 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls podName:96e95540-055f-454d-b85c-31093fbd7bbf nodeName:}" failed. No retries permitted until 2026-04-16 20:12:29.612692847 +0000 UTC m=+34.052337803 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls") pod "dns-default-k5t5f" (UID: "96e95540-055f-454d-b85c-31093fbd7bbf") : secret "dns-default-metrics-tls" not found Apr 16 20:12:29.112946 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.112805 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96e95540-055f-454d-b85c-31093fbd7bbf-config-volume\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:29.113022 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.113004 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96e95540-055f-454d-b85c-31093fbd7bbf-tmp-dir\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:29.113308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.113292 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/96e95540-055f-454d-b85c-31093fbd7bbf-config-volume\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:29.123189 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.123166 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vv67\" (UniqueName: \"kubernetes.io/projected/96e95540-055f-454d-b85c-31093fbd7bbf-kube-api-access-5vv67\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:29.213840 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.213807 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dhc6\" (UniqueName: 
\"kubernetes.io/projected/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-kube-api-access-2dhc6\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts" Apr 16 20:12:29.213988 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.213854 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts" Apr 16 20:12:29.213988 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:29.213954 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:29.214084 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:29.214025 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert podName:42bbcb75-7cbe-482c-8c08-a9ceeb1c626d nodeName:}" failed. No retries permitted until 2026-04-16 20:12:29.714008619 +0000 UTC m=+34.153653565 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert") pod "ingress-canary-t7fts" (UID: "42bbcb75-7cbe-482c-8c08-a9ceeb1c626d") : secret "canary-serving-cert" not found Apr 16 20:12:29.222373 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.222347 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dhc6\" (UniqueName: \"kubernetes.io/projected/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-kube-api-access-2dhc6\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts" Apr 16 20:12:29.244041 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.244016 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf" Apr 16 20:12:29.244041 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.244032 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:12:29.244202 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.244016 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78" Apr 16 20:12:29.246751 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.246544 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:12:29.246751 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.246558 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:12:29.246751 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.246545 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:12:29.246751 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.246689 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8rq6t\"" Apr 16 20:12:29.246751 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.246711 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 20:12:29.247006 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.246958 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q2prt\"" Apr 16 20:12:29.617088 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.617036 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:29.617261 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:29.617192 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:29.617341 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:29.617265 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls podName:96e95540-055f-454d-b85c-31093fbd7bbf nodeName:}" failed. No retries permitted until 2026-04-16 20:12:30.617246862 +0000 UTC m=+35.056891827 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls") pod "dns-default-k5t5f" (UID: "96e95540-055f-454d-b85c-31093fbd7bbf") : secret "dns-default-metrics-tls" not found Apr 16 20:12:29.717767 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:29.717721 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts" Apr 16 20:12:29.717949 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:29.717889 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:29.718016 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:29.717962 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert podName:42bbcb75-7cbe-482c-8c08-a9ceeb1c626d nodeName:}" failed. 
No retries permitted until 2026-04-16 20:12:30.717940718 +0000 UTC m=+35.157585674 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert") pod "ingress-canary-t7fts" (UID: "42bbcb75-7cbe-482c-8c08-a9ceeb1c626d") : secret "canary-serving-cert" not found Apr 16 20:12:30.624507 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:30.624484 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:30.624969 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:30.624677 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:30.624969 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:30.624750 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls podName:96e95540-055f-454d-b85c-31093fbd7bbf nodeName:}" failed. No retries permitted until 2026-04-16 20:12:32.624731479 +0000 UTC m=+37.064376436 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls") pod "dns-default-k5t5f" (UID: "96e95540-055f-454d-b85c-31093fbd7bbf") : secret "dns-default-metrics-tls" not found Apr 16 20:12:30.725596 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:30.725511 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts" Apr 16 20:12:30.725758 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:30.725673 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:30.725758 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:30.725729 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert podName:42bbcb75-7cbe-482c-8c08-a9ceeb1c626d nodeName:}" failed. No retries permitted until 2026-04-16 20:12:32.725714561 +0000 UTC m=+37.165359504 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert") pod "ingress-canary-t7fts" (UID: "42bbcb75-7cbe-482c-8c08-a9ceeb1c626d") : secret "canary-serving-cert" not found Apr 16 20:12:31.530227 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:31.530196 2566 generic.go:358] "Generic (PLEG): container finished" podID="6321d139-42ba-4ad4-96d3-6dafabbdc869" containerID="3a8d995c49da386aebcbc2a85bc2ecca6736c418938c1175d0c8c78031d29d44" exitCode=0 Apr 16 20:12:31.530402 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:31.530238 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4llv8" event={"ID":"6321d139-42ba-4ad4-96d3-6dafabbdc869","Type":"ContainerDied","Data":"3a8d995c49da386aebcbc2a85bc2ecca6736c418938c1175d0c8c78031d29d44"} Apr 16 20:12:32.534319 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:32.534132 2566 generic.go:358] "Generic (PLEG): container finished" podID="6321d139-42ba-4ad4-96d3-6dafabbdc869" containerID="8a2c8f4960a70c5823e8f2b1135a8cdc66b0daf3923cadf9a7e3b5ce251fa266" exitCode=0 Apr 16 20:12:32.534662 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:32.534211 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4llv8" event={"ID":"6321d139-42ba-4ad4-96d3-6dafabbdc869","Type":"ContainerDied","Data":"8a2c8f4960a70c5823e8f2b1135a8cdc66b0daf3923cadf9a7e3b5ce251fa266"} Apr 16 20:12:32.642468 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:32.642447 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:32.642655 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:32.642628 2566 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:32.642768 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:32.642713 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls podName:96e95540-055f-454d-b85c-31093fbd7bbf nodeName:}" failed. No retries permitted until 2026-04-16 20:12:36.642687705 +0000 UTC m=+41.082332671 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls") pod "dns-default-k5t5f" (UID: "96e95540-055f-454d-b85c-31093fbd7bbf") : secret "dns-default-metrics-tls" not found Apr 16 20:12:32.742967 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:32.742943 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts" Apr 16 20:12:32.743088 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:32.743074 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:32.743147 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:32.743138 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert podName:42bbcb75-7cbe-482c-8c08-a9ceeb1c626d nodeName:}" failed. No retries permitted until 2026-04-16 20:12:36.743124545 +0000 UTC m=+41.182769491 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert") pod "ingress-canary-t7fts" (UID: "42bbcb75-7cbe-482c-8c08-a9ceeb1c626d") : secret "canary-serving-cert" not found Apr 16 20:12:33.538718 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:33.538683 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4llv8" event={"ID":"6321d139-42ba-4ad4-96d3-6dafabbdc869","Type":"ContainerStarted","Data":"f457ca1e018b1113573546fd38b9037d64e260c5deb9863be8783ed59809ed36"} Apr 16 20:12:33.565306 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:33.565260 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4llv8" podStartSLOduration=4.526623038 podStartE2EDuration="37.565246862s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="2026-04-16 20:11:57.416120376 +0000 UTC m=+1.855765318" lastFinishedPulling="2026-04-16 20:12:30.454744201 +0000 UTC m=+34.894389142" observedRunningTime="2026-04-16 20:12:33.564009906 +0000 UTC m=+38.003654869" watchObservedRunningTime="2026-04-16 20:12:33.565246862 +0000 UTC m=+38.004891827" Apr 16 20:12:36.670294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:36.670264 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:36.670734 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:36.670366 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:36.670734 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:36.670421 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls podName:96e95540-055f-454d-b85c-31093fbd7bbf nodeName:}" failed. No retries permitted until 2026-04-16 20:12:44.670408543 +0000 UTC m=+49.110053498 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls") pod "dns-default-k5t5f" (UID: "96e95540-055f-454d-b85c-31093fbd7bbf") : secret "dns-default-metrics-tls" not found Apr 16 20:12:36.771284 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:36.771256 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts" Apr 16 20:12:36.771387 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:36.771373 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:36.771427 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:36.771423 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert podName:42bbcb75-7cbe-482c-8c08-a9ceeb1c626d nodeName:}" failed. No retries permitted until 2026-04-16 20:12:44.771407415 +0000 UTC m=+49.211052374 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert") pod "ingress-canary-t7fts" (UID: "42bbcb75-7cbe-482c-8c08-a9ceeb1c626d") : secret "canary-serving-cert" not found Apr 16 20:12:38.179332 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:38.179292 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78" Apr 16 20:12:38.182459 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:38.182438 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8a902087-f546-42a1-b9a5-96dab151ae99-original-pull-secret\") pod \"global-pull-secret-syncer-kpb78\" (UID: \"8a902087-f546-42a1-b9a5-96dab151ae99\") " pod="kube-system/global-pull-secret-syncer-kpb78" Apr 16 20:12:38.262292 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:38.262267 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kpb78" Apr 16 20:12:38.407034 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:38.407005 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kpb78"] Apr 16 20:12:38.412131 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:12:38.412104 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a902087_f546_42a1_b9a5_96dab151ae99.slice/crio-308ca02293ff01b768882e6916cd6bbf028da87ee7f73b209bbd262a7eca5bf5 WatchSource:0}: Error finding container 308ca02293ff01b768882e6916cd6bbf028da87ee7f73b209bbd262a7eca5bf5: Status 404 returned error can't find the container with id 308ca02293ff01b768882e6916cd6bbf028da87ee7f73b209bbd262a7eca5bf5 Apr 16 20:12:38.548319 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:38.548280 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kpb78" event={"ID":"8a902087-f546-42a1-b9a5-96dab151ae99","Type":"ContainerStarted","Data":"308ca02293ff01b768882e6916cd6bbf028da87ee7f73b209bbd262a7eca5bf5"} Apr 16 20:12:42.556810 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:42.556776 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kpb78" event={"ID":"8a902087-f546-42a1-b9a5-96dab151ae99","Type":"ContainerStarted","Data":"c4ec3ba22f934872c1b75cb9a2580b880f053be429d41328292f3ce35aa1df91"} Apr 16 20:12:42.575055 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:42.575014 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kpb78" podStartSLOduration=32.6149107 podStartE2EDuration="36.575002206s" podCreationTimestamp="2026-04-16 20:12:06 +0000 UTC" firstStartedPulling="2026-04-16 20:12:38.413728965 +0000 UTC m=+42.853373907" lastFinishedPulling="2026-04-16 20:12:42.373820466 +0000 UTC m=+46.813465413" 
observedRunningTime="2026-04-16 20:12:42.574939259 +0000 UTC m=+47.014584261" watchObservedRunningTime="2026-04-16 20:12:42.575002206 +0000 UTC m=+47.014647164" Apr 16 20:12:44.726844 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:44.726807 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:12:44.727214 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:44.726950 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:44.727214 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:44.727010 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls podName:96e95540-055f-454d-b85c-31093fbd7bbf nodeName:}" failed. No retries permitted until 2026-04-16 20:13:00.726994062 +0000 UTC m=+65.166639009 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls") pod "dns-default-k5t5f" (UID: "96e95540-055f-454d-b85c-31093fbd7bbf") : secret "dns-default-metrics-tls" not found Apr 16 20:12:44.827543 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:44.827518 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts" Apr 16 20:12:44.827674 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:44.827650 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:44.827714 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:12:44.827694 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert podName:42bbcb75-7cbe-482c-8c08-a9ceeb1c626d nodeName:}" failed. No retries permitted until 2026-04-16 20:13:00.827682604 +0000 UTC m=+65.267327546 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert") pod "ingress-canary-t7fts" (UID: "42bbcb75-7cbe-482c-8c08-a9ceeb1c626d") : secret "canary-serving-cert" not found Apr 16 20:12:55.527235 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:12:55.527208 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8lrq8" Apr 16 20:13:00.735106 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:00.735071 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f" Apr 16 20:13:00.735536 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:13:00.735214 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:13:00.735536 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:13:00.735280 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls podName:96e95540-055f-454d-b85c-31093fbd7bbf nodeName:}" failed. No retries permitted until 2026-04-16 20:13:32.73526379 +0000 UTC m=+97.174908750 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls") pod "dns-default-k5t5f" (UID: "96e95540-055f-454d-b85c-31093fbd7bbf") : secret "dns-default-metrics-tls" not found Apr 16 20:13:00.836108 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:00.836071 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts" Apr 16 20:13:00.836242 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:13:00.836220 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:13:00.836331 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:13:00.836322 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert podName:42bbcb75-7cbe-482c-8c08-a9ceeb1c626d nodeName:}" failed. No retries permitted until 2026-04-16 20:13:32.836305618 +0000 UTC m=+97.275950564 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert") pod "ingress-canary-t7fts" (UID: "42bbcb75-7cbe-482c-8c08-a9ceeb1c626d") : secret "canary-serving-cert" not found Apr 16 20:13:00.937464 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:00.937434 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:13:00.939740 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:00.939724 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:13:00.948542 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:13:00.948519 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:13:00.948614 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:13:00.948578 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs podName:02d874be-6206-4feb-99d1-3539318d290b nodeName:}" failed. No retries permitted until 2026-04-16 20:14:04.948561362 +0000 UTC m=+129.388206310 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs") pod "network-metrics-daemon-jdfnl" (UID: "02d874be-6206-4feb-99d1-3539318d290b") : secret "metrics-daemon-secret" not found
Apr 16 20:13:01.038699 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:01.038679 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz\") pod \"network-check-target-tkpmf\" (UID: \"3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38\") " pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:13:01.040954 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:01.040939 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 20:13:01.051134 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:01.051117 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 20:13:01.062460 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:01.062438 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38-kube-api-access-jhrfz\") pod \"network-check-target-tkpmf\" (UID: \"3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38\") " pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:13:01.361793 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:01.361725 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8rq6t\""
Apr 16 20:13:01.366828 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:01.366814 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:13:01.498319 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:01.498287 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tkpmf"]
Apr 16 20:13:01.501466 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:13:01.501437 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3713fd95_2eeb_4bfb_9189_fc6bf6ca4b38.slice/crio-f399cc0914f871ebed33b9f4c31f3da8c058bc0d2705d151e13bacfa48a34e24 WatchSource:0}: Error finding container f399cc0914f871ebed33b9f4c31f3da8c058bc0d2705d151e13bacfa48a34e24: Status 404 returned error can't find the container with id f399cc0914f871ebed33b9f4c31f3da8c058bc0d2705d151e13bacfa48a34e24
Apr 16 20:13:01.592225 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:01.592198 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tkpmf" event={"ID":"3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38","Type":"ContainerStarted","Data":"f399cc0914f871ebed33b9f4c31f3da8c058bc0d2705d151e13bacfa48a34e24"}
Apr 16 20:13:04.598986 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:04.598950 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tkpmf" event={"ID":"3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38","Type":"ContainerStarted","Data":"b09623623fcbacf6cee64c8fc94751f7c46107dfd9f86681ab98b04dc74bbc80"}
Apr 16 20:13:04.599353 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:04.599084 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:13:04.614467 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:04.614421 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-tkpmf" podStartSLOduration=65.98900081 podStartE2EDuration="1m8.61440978s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="2026-04-16 20:13:01.50344697 +0000 UTC m=+65.943091912" lastFinishedPulling="2026-04-16 20:13:04.128855939 +0000 UTC m=+68.568500882" observedRunningTime="2026-04-16 20:13:04.613214073 +0000 UTC m=+69.052859027" watchObservedRunningTime="2026-04-16 20:13:04.61440978 +0000 UTC m=+69.054054743"
Apr 16 20:13:32.739884 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:32.739847 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f"
Apr 16 20:13:32.740338 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:13:32.739956 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:13:32.740338 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:13:32.740016 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls podName:96e95540-055f-454d-b85c-31093fbd7bbf nodeName:}" failed. No retries permitted until 2026-04-16 20:14:36.740002286 +0000 UTC m=+161.179647228 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls") pod "dns-default-k5t5f" (UID: "96e95540-055f-454d-b85c-31093fbd7bbf") : secret "dns-default-metrics-tls" not found
Apr 16 20:13:32.840929 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:32.840894 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts"
Apr 16 20:13:32.841026 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:13:32.841002 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:13:32.841062 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:13:32.841049 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert podName:42bbcb75-7cbe-482c-8c08-a9ceeb1c626d nodeName:}" failed. No retries permitted until 2026-04-16 20:14:36.841036746 +0000 UTC m=+161.280681688 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert") pod "ingress-canary-t7fts" (UID: "42bbcb75-7cbe-482c-8c08-a9ceeb1c626d") : secret "canary-serving-cert" not found
Apr 16 20:13:35.603945 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:13:35.603911 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-tkpmf"
Apr 16 20:14:04.955851 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:04.955797 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:14:04.956356 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:04.955940 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 20:14:04.956356 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:04.956015 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs podName:02d874be-6206-4feb-99d1-3539318d290b nodeName:}" failed. No retries permitted until 2026-04-16 20:16:06.95599944 +0000 UTC m=+251.395644382 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs") pod "network-metrics-daemon-jdfnl" (UID: "02d874be-6206-4feb-99d1-3539318d290b") : secret "metrics-daemon-secret" not found
Apr 16 20:14:12.421761 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.421728 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j6vbj"]
Apr 16 20:14:12.424210 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.424195 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j6vbj"
Apr 16 20:14:12.428651 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.428632 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 20:14:12.428760 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.428637 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-f2w8g\""
Apr 16 20:14:12.429340 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.429323 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:14:12.435237 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.435216 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j6vbj"]
Apr 16 20:14:12.503908 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.503876 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qbh\" (UniqueName: \"kubernetes.io/projected/0a787dc5-7cdd-40df-bd39-3b2a23ba5aa7-kube-api-access-s9qbh\") pod \"volume-data-source-validator-7c6cbb6c87-j6vbj\" (UID: \"0a787dc5-7cdd-40df-bd39-3b2a23ba5aa7\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j6vbj"
Apr 16 20:14:12.521175 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.521146 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-q5swj"]
Apr 16 20:14:12.523715 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.523697 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.528071 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.528053 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"]
Apr 16 20:14:12.530582 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.530564 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7b64899dcd-25ctw"]
Apr 16 20:14:12.530717 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.530700 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"
Apr 16 20:14:12.531508 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.531489 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 20:14:12.531885 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.531869 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 20:14:12.533213 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.533063 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.578387 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.578366 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 20:14:12.583194 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.583181 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-h56nh\""
Apr 16 20:14:12.584850 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.584836 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 20:14:12.604540 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604519 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 20:14:12.604714 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604689 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/387f5caa-46e7-4c7e-9eb3-9fececd0858d-service-ca-bundle\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.604760 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604730 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k52fk\" (UniqueName: \"kubernetes.io/projected/3d1461df-f676-49a3-a685-4bddd22c8287-kube-api-access-k52fk\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.604760 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604751 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/387f5caa-46e7-4c7e-9eb3-9fececd0858d-snapshots\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.604822 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604770 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.604822 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604789 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvplp\" (UniqueName: \"kubernetes.io/projected/387f5caa-46e7-4c7e-9eb3-9fececd0858d-kube-api-access-cvplp\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.604822 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604809 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2pn8\" (UniqueName: \"kubernetes.io/projected/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-kube-api-access-k2pn8\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"
Apr 16 20:14:12.604940 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604867 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"
Apr 16 20:14:12.604940 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604906 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9qbh\" (UniqueName: \"kubernetes.io/projected/0a787dc5-7cdd-40df-bd39-3b2a23ba5aa7-kube-api-access-s9qbh\") pod \"volume-data-source-validator-7c6cbb6c87-j6vbj\" (UID: \"0a787dc5-7cdd-40df-bd39-3b2a23ba5aa7\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j6vbj"
Apr 16 20:14:12.604940 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604930 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/387f5caa-46e7-4c7e-9eb3-9fececd0858d-tmp\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.605061 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604949 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-default-certificate\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.605061 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604967 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"
Apr 16 20:14:12.605061 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.604993 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/387f5caa-46e7-4c7e-9eb3-9fececd0858d-serving-cert\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.605061 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.605008 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-stats-auth\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.605236 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.605103 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/387f5caa-46e7-4c7e-9eb3-9fececd0858d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.605236 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.605138 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.609545 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.609526 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-q5swj"]
Apr 16 20:14:12.610451 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.610435 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"]
Apr 16 20:14:12.611646 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.611632 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 20:14:12.612128 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.612112 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 20:14:12.612612 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.612587 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 20:14:12.612674 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.612627 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-pqdwz\""
Apr 16 20:14:12.612850 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.612836 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-q2jhz\""
Apr 16 20:14:12.612979 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.612962 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 20:14:12.613319 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.613304 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 20:14:12.613391 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.613331 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 20:14:12.613730 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.613715 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 20:14:12.613909 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.613894 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 20:14:12.614665 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.614646 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 20:14:12.614904 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.614890 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 20:14:12.622201 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.622182 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b64899dcd-25ctw"]
Apr 16 20:14:12.628535 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.628517 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9qbh\" (UniqueName: \"kubernetes.io/projected/0a787dc5-7cdd-40df-bd39-3b2a23ba5aa7-kube-api-access-s9qbh\") pod \"volume-data-source-validator-7c6cbb6c87-j6vbj\" (UID: \"0a787dc5-7cdd-40df-bd39-3b2a23ba5aa7\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j6vbj"
Apr 16 20:14:12.706354 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.706299 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/387f5caa-46e7-4c7e-9eb3-9fececd0858d-serving-cert\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.706354 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.706324 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-stats-auth\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.706354 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.706347 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/387f5caa-46e7-4c7e-9eb3-9fececd0858d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.706548 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.706364 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.706548 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.706489 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/387f5caa-46e7-4c7e-9eb3-9fececd0858d-service-ca-bundle\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.706978 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.706953 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k52fk\" (UniqueName: \"kubernetes.io/projected/3d1461df-f676-49a3-a685-4bddd22c8287-kube-api-access-k52fk\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.707154 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.707135 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/387f5caa-46e7-4c7e-9eb3-9fececd0858d-snapshots\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.707286 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.707271 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.707411 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.707396 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvplp\" (UniqueName: \"kubernetes.io/projected/387f5caa-46e7-4c7e-9eb3-9fececd0858d-kube-api-access-cvplp\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.707530 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.707513 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2pn8\" (UniqueName: \"kubernetes.io/projected/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-kube-api-access-k2pn8\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"
Apr 16 20:14:12.707704 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.707682 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/387f5caa-46e7-4c7e-9eb3-9fececd0858d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.707844 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.707795 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/387f5caa-46e7-4c7e-9eb3-9fececd0858d-service-ca-bundle\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.707844 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.707692 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"
Apr 16 20:14:12.707956 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.707902 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/387f5caa-46e7-4c7e-9eb3-9fececd0858d-tmp\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.707956 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.707942 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-default-certificate\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.708051 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.707980 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"
Apr 16 20:14:12.708102 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:12.708089 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle podName:3d1461df-f676-49a3-a685-4bddd22c8287 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:13.208067286 +0000 UTC m=+137.647712234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle") pod "router-default-7b64899dcd-25ctw" (UID: "3d1461df-f676-49a3-a685-4bddd22c8287") : configmap references non-existent config key: service-ca.crt
Apr 16 20:14:12.708162 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:12.708129 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:12.708214 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:12.708202 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls podName:a5125243-6f0d-4b3b-a7dc-3c481a10fdcb nodeName:}" failed. No retries permitted until 2026-04-16 20:14:13.208184918 +0000 UTC m=+137.647830101 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lcndm" (UID: "a5125243-6f0d-4b3b-a7dc-3c481a10fdcb") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:12.708462 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:12.708434 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 20:14:12.708567 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:12.708503 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs podName:3d1461df-f676-49a3-a685-4bddd22c8287 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:13.208489031 +0000 UTC m=+137.648133994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs") pod "router-default-7b64899dcd-25ctw" (UID: "3d1461df-f676-49a3-a685-4bddd22c8287") : secret "router-metrics-certs-default" not found
Apr 16 20:14:12.709053 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.708713 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/387f5caa-46e7-4c7e-9eb3-9fececd0858d-snapshots\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.709053 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.709028 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/387f5caa-46e7-4c7e-9eb3-9fececd0858d-tmp\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.709454 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.709429 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/387f5caa-46e7-4c7e-9eb3-9fececd0858d-serving-cert\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.711412 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.709557 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"
Apr 16 20:14:12.711412 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.710015 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-stats-auth\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.711559 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.711417 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-default-certificate\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.717126 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.717105 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k52fk\" (UniqueName: \"kubernetes.io/projected/3d1461df-f676-49a3-a685-4bddd22c8287-kube-api-access-k52fk\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:12.717215 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.717126 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvplp\" (UniqueName: \"kubernetes.io/projected/387f5caa-46e7-4c7e-9eb3-9fececd0858d-kube-api-access-cvplp\") pod \"insights-operator-585dfdc468-q5swj\" (UID: \"387f5caa-46e7-4c7e-9eb3-9fececd0858d\") " pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.717215 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.717105 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2pn8\" (UniqueName: \"kubernetes.io/projected/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-kube-api-access-k2pn8\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"
Apr 16 20:14:12.732143 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.732125 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j6vbj"
Apr 16 20:14:12.832055 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.832028 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-q5swj"
Apr 16 20:14:12.845210 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.845186 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j6vbj"]
Apr 16 20:14:12.848911 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:14:12.848854 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a787dc5_7cdd_40df_bd39_3b2a23ba5aa7.slice/crio-bfd206b4db326f859097b4d0bd57e3f7697760c8d21fdfbc5df8caadd4b8f36c WatchSource:0}: Error finding container bfd206b4db326f859097b4d0bd57e3f7697760c8d21fdfbc5df8caadd4b8f36c: Status 404 returned error can't find the container with id bfd206b4db326f859097b4d0bd57e3f7697760c8d21fdfbc5df8caadd4b8f36c
Apr 16 20:14:12.944493 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.944464 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-q5swj"]
Apr 16 20:14:12.947653 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:14:12.947632 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod387f5caa_46e7_4c7e_9eb3_9fececd0858d.slice/crio-2668c45902131e261b7440e68435c31072183dc49214e6a2852a710b21394fad WatchSource:0}: Error finding container 2668c45902131e261b7440e68435c31072183dc49214e6a2852a710b21394fad: Status 404 returned error can't find the container with id 2668c45902131e261b7440e68435c31072183dc49214e6a2852a710b21394fad
Apr 16 20:14:12.993168 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.993148 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-55f777c949-lbh9s"]
Apr 16 20:14:12.997131 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.997116 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55f777c949-lbh9s"
Apr 16 20:14:12.999766 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.999749 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 20:14:12.999952 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:12.999932 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kmdps\""
Apr 16 20:14:13.000214 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.000200 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 20:14:13.000661 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.000649 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 20:14:13.006364 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.006347 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 20:14:13.007514 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.007496 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55f777c949-lbh9s"]
Apr 16 20:14:13.111438 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.111405 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-bound-sa-token\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s"
Apr 16 20:14:13.111621 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.111454 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8034ee30-2d4d-4d63-b281-5171ce85150a-installation-pull-secrets\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.111621 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.111478 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm6q2\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-kube-api-access-sm6q2\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.111621 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.111547 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.111762 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.111617 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8034ee30-2d4d-4d63-b281-5171ce85150a-image-registry-private-configuration\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.111762 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.111645 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8034ee30-2d4d-4d63-b281-5171ce85150a-ca-trust-extracted\") pod 
\"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.111762 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.111737 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8034ee30-2d4d-4d63-b281-5171ce85150a-trusted-ca\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.111865 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.111778 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-certificates\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.212420 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.212392 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8034ee30-2d4d-4d63-b281-5171ce85150a-installation-pull-secrets\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.212544 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.212426 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sm6q2\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-kube-api-access-sm6q2\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.212544 ip-10-0-142-60 kubenswrapper[2566]: 
I0416 20:14:13.212451 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm" Apr 16 20:14:13.212544 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.212480 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.212727 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:13.212554 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:14:13.212727 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.212614 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8034ee30-2d4d-4d63-b281-5171ce85150a-image-registry-private-configuration\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.212727 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:13.212626 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls podName:a5125243-6f0d-4b3b-a7dc-3c481a10fdcb nodeName:}" failed. No retries permitted until 2026-04-16 20:14:14.212593205 +0000 UTC m=+138.652238147 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lcndm" (UID: "a5125243-6f0d-4b3b-a7dc-3c481a10fdcb") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:14:13.212727 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:13.212557 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:14:13.212727 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.212666 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8034ee30-2d4d-4d63-b281-5171ce85150a-ca-trust-extracted\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.212727 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:13.212678 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f777c949-lbh9s: secret "image-registry-tls" not found Apr 16 20:14:13.212727 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.212727 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw" Apr 16 20:14:13.213063 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:13.212752 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls podName:8034ee30-2d4d-4d63-b281-5171ce85150a nodeName:}" failed. 
No retries permitted until 2026-04-16 20:14:13.712725207 +0000 UTC m=+138.152370165 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls") pod "image-registry-55f777c949-lbh9s" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a") : secret "image-registry-tls" not found Apr 16 20:14:13.213063 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.212785 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8034ee30-2d4d-4d63-b281-5171ce85150a-trusted-ca\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.213063 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:13.212804 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:14:13.213063 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.212833 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-certificates\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.213063 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:13.212859 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs podName:3d1461df-f676-49a3-a685-4bddd22c8287 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:14.212841852 +0000 UTC m=+138.652486811 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs") pod "router-default-7b64899dcd-25ctw" (UID: "3d1461df-f676-49a3-a685-4bddd22c8287") : secret "router-metrics-certs-default" not found Apr 16 20:14:13.213063 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.212928 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw" Apr 16 20:14:13.213063 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.212966 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-bound-sa-token\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.213414 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:13.213092 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle podName:3d1461df-f676-49a3-a685-4bddd22c8287 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:14.213071078 +0000 UTC m=+138.652716044 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle") pod "router-default-7b64899dcd-25ctw" (UID: "3d1461df-f676-49a3-a685-4bddd22c8287") : configmap references non-existent config key: service-ca.crt Apr 16 20:14:13.213414 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.213106 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8034ee30-2d4d-4d63-b281-5171ce85150a-ca-trust-extracted\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.213414 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.213358 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-certificates\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.213740 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.213718 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8034ee30-2d4d-4d63-b281-5171ce85150a-trusted-ca\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.215026 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.215008 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8034ee30-2d4d-4d63-b281-5171ce85150a-installation-pull-secrets\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " 
pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.215123 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.215046 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8034ee30-2d4d-4d63-b281-5171ce85150a-image-registry-private-configuration\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.221419 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.221391 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm6q2\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-kube-api-access-sm6q2\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.221635 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.221618 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-bound-sa-token\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.718124 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.718074 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:13.718516 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:13.718222 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" 
not found Apr 16 20:14:13.718516 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:13.718238 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f777c949-lbh9s: secret "image-registry-tls" not found Apr 16 20:14:13.718516 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:13.718301 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls podName:8034ee30-2d4d-4d63-b281-5171ce85150a nodeName:}" failed. No retries permitted until 2026-04-16 20:14:14.718282258 +0000 UTC m=+139.157927202 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls") pod "image-registry-55f777c949-lbh9s" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a") : secret "image-registry-tls" not found Apr 16 20:14:13.731660 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.731630 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j6vbj" event={"ID":"0a787dc5-7cdd-40df-bd39-3b2a23ba5aa7","Type":"ContainerStarted","Data":"bfd206b4db326f859097b4d0bd57e3f7697760c8d21fdfbc5df8caadd4b8f36c"} Apr 16 20:14:13.732923 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:13.732889 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-q5swj" event={"ID":"387f5caa-46e7-4c7e-9eb3-9fececd0858d","Type":"ContainerStarted","Data":"2668c45902131e261b7440e68435c31072183dc49214e6a2852a710b21394fad"} Apr 16 20:14:14.221862 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:14.221837 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle\") pod \"router-default-7b64899dcd-25ctw\" (UID: 
\"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw" Apr 16 20:14:14.221968 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:14.221921 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm" Apr 16 20:14:14.222020 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:14.221998 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle podName:3d1461df-f676-49a3-a685-4bddd22c8287 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:16.221977579 +0000 UTC m=+140.661622545 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle") pod "router-default-7b64899dcd-25ctw" (UID: "3d1461df-f676-49a3-a685-4bddd22c8287") : configmap references non-existent config key: service-ca.crt Apr 16 20:14:14.222089 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:14.222034 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw" Apr 16 20:14:14.222089 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:14.222058 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:14:14.222176 ip-10-0-142-60 kubenswrapper[2566]: E0416 
20:14:14.222104 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls podName:a5125243-6f0d-4b3b-a7dc-3c481a10fdcb nodeName:}" failed. No retries permitted until 2026-04-16 20:14:16.222089743 +0000 UTC m=+140.661734690 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lcndm" (UID: "a5125243-6f0d-4b3b-a7dc-3c481a10fdcb") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:14:14.222176 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:14.222141 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:14:14.222274 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:14.222192 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs podName:3d1461df-f676-49a3-a685-4bddd22c8287 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:16.222177736 +0000 UTC m=+140.661822685 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs") pod "router-default-7b64899dcd-25ctw" (UID: "3d1461df-f676-49a3-a685-4bddd22c8287") : secret "router-metrics-certs-default" not found Apr 16 20:14:14.725735 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:14.725698 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s" Apr 16 20:14:14.726076 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:14.725847 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:14:14.726076 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:14.725867 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f777c949-lbh9s: secret "image-registry-tls" not found Apr 16 20:14:14.726076 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:14.725922 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls podName:8034ee30-2d4d-4d63-b281-5171ce85150a nodeName:}" failed. No retries permitted until 2026-04-16 20:14:16.725905728 +0000 UTC m=+141.165550670 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls") pod "image-registry-55f777c949-lbh9s" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a") : secret "image-registry-tls" not found Apr 16 20:14:14.735583 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:14.735558 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j6vbj" event={"ID":"0a787dc5-7cdd-40df-bd39-3b2a23ba5aa7","Type":"ContainerStarted","Data":"bd1302025bb506dba038cac3b4770b5915ac47cfab8ae7da2edd6b63f774868b"} Apr 16 20:14:14.751578 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:14.751538 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-j6vbj" podStartSLOduration=1.446320493 podStartE2EDuration="2.751523318s" podCreationTimestamp="2026-04-16 20:14:12 +0000 UTC" firstStartedPulling="2026-04-16 20:14:12.851149118 +0000 UTC m=+137.290794064" lastFinishedPulling="2026-04-16 20:14:14.156351946 +0000 UTC m=+138.595996889" observedRunningTime="2026-04-16 20:14:14.75135427 +0000 UTC m=+139.190999234" watchObservedRunningTime="2026-04-16 20:14:14.751523318 +0000 UTC m=+139.191168287" Apr 16 20:14:15.738994 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:15.738954 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-q5swj" event={"ID":"387f5caa-46e7-4c7e-9eb3-9fececd0858d","Type":"ContainerStarted","Data":"b4ccbc7a1b12bb97bd4d599c897bd5065ad1bef312a0ef0eb37a0b474d9967d5"} Apr 16 20:14:15.755631 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:15.755569 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-q5swj" podStartSLOduration=1.930103012 podStartE2EDuration="3.755552941s" podCreationTimestamp="2026-04-16 20:14:12 
+0000 UTC" firstStartedPulling="2026-04-16 20:14:12.949368013 +0000 UTC m=+137.389012956" lastFinishedPulling="2026-04-16 20:14:14.774817941 +0000 UTC m=+139.214462885" observedRunningTime="2026-04-16 20:14:15.754833199 +0000 UTC m=+140.194478175" watchObservedRunningTime="2026-04-16 20:14:15.755552941 +0000 UTC m=+140.195197948" Apr 16 20:14:16.238776 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:16.238744 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw" Apr 16 20:14:16.238916 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:16.238830 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm" Apr 16 20:14:16.238916 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:16.238904 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle podName:3d1461df-f676-49a3-a685-4bddd22c8287 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:20.238884519 +0000 UTC m=+144.678529466 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle") pod "router-default-7b64899dcd-25ctw" (UID: "3d1461df-f676-49a3-a685-4bddd22c8287") : configmap references non-existent config key: service-ca.crt
Apr 16 20:14:16.239048 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:16.238933 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:16.239048 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:16.238979 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:16.239048 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:16.239023 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 20:14:16.239048 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:16.239042 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls podName:a5125243-6f0d-4b3b-a7dc-3c481a10fdcb nodeName:}" failed. No retries permitted until 2026-04-16 20:14:20.239024637 +0000 UTC m=+144.678669596 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lcndm" (UID: "a5125243-6f0d-4b3b-a7dc-3c481a10fdcb") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:16.239197 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:16.239063 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs podName:3d1461df-f676-49a3-a685-4bddd22c8287 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:20.239052627 +0000 UTC m=+144.678697576 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs") pod "router-default-7b64899dcd-25ctw" (UID: "3d1461df-f676-49a3-a685-4bddd22c8287") : secret "router-metrics-certs-default" not found
Apr 16 20:14:16.743329 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:16.743302 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s"
Apr 16 20:14:16.743705 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:16.743433 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:14:16.743705 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:16.743447 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f777c949-lbh9s: secret "image-registry-tls" not found
Apr 16 20:14:16.743705 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:16.743496 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls podName:8034ee30-2d4d-4d63-b281-5171ce85150a nodeName:}" failed. No retries permitted until 2026-04-16 20:14:20.743481162 +0000 UTC m=+145.183126119 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls") pod "image-registry-55f777c949-lbh9s" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a") : secret "image-registry-tls" not found
Apr 16 20:14:19.424779 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:19.424752 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q76dr_87948db0-f0f9-46ff-ad52-0b6cb7a17f42/dns-node-resolver/0.log"
Apr 16 20:14:20.022870 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:20.022838 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5wvp4_3fe6aa55-9c5e-4ed7-bd31-4790e51c271b/node-ca/0.log"
Apr 16 20:14:20.273397 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:20.273321 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"
Apr 16 20:14:20.273397 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:20.273376 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:20.273565 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:20.273425 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:20.273565 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:20.273460 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:20.273565 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:20.273459 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 20:14:20.273565 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:20.273518 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls podName:a5125243-6f0d-4b3b-a7dc-3c481a10fdcb nodeName:}" failed. No retries permitted until 2026-04-16 20:14:28.273503847 +0000 UTC m=+152.713148788 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lcndm" (UID: "a5125243-6f0d-4b3b-a7dc-3c481a10fdcb") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:20.273565 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:20.273557 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle podName:3d1461df-f676-49a3-a685-4bddd22c8287 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:28.273545154 +0000 UTC m=+152.713190096 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle") pod "router-default-7b64899dcd-25ctw" (UID: "3d1461df-f676-49a3-a685-4bddd22c8287") : configmap references non-existent config key: service-ca.crt
Apr 16 20:14:20.273565 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:20.273567 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs podName:3d1461df-f676-49a3-a685-4bddd22c8287 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:28.273561557 +0000 UTC m=+152.713206499 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs") pod "router-default-7b64899dcd-25ctw" (UID: "3d1461df-f676-49a3-a685-4bddd22c8287") : secret "router-metrics-certs-default" not found
Apr 16 20:14:20.778047 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:20.778016 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s"
Apr 16 20:14:20.778410 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:20.778135 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:14:20.778410 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:20.778147 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f777c949-lbh9s: secret "image-registry-tls" not found
Apr 16 20:14:20.778410 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:20.778194 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls podName:8034ee30-2d4d-4d63-b281-5171ce85150a nodeName:}" failed. No retries permitted until 2026-04-16 20:14:28.77818078 +0000 UTC m=+153.217825724 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls") pod "image-registry-55f777c949-lbh9s" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a") : secret "image-registry-tls" not found
Apr 16 20:14:28.340353 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.340314 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:28.340833 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.340384 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"
Apr 16 20:14:28.340833 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.340447 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:28.340833 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:28.340531 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:28.340833 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:28.340595 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls podName:a5125243-6f0d-4b3b-a7dc-3c481a10fdcb nodeName:}" failed. No retries permitted until 2026-04-16 20:14:44.340579204 +0000 UTC m=+168.780224159 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lcndm" (UID: "a5125243-6f0d-4b3b-a7dc-3c481a10fdcb") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:14:28.341055 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.340999 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d1461df-f676-49a3-a685-4bddd22c8287-service-ca-bundle\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:28.342702 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.342682 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d1461df-f676-49a3-a685-4bddd22c8287-metrics-certs\") pod \"router-default-7b64899dcd-25ctw\" (UID: \"3d1461df-f676-49a3-a685-4bddd22c8287\") " pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:28.443773 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.443744 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:28.561449 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.561420 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b64899dcd-25ctw"]
Apr 16 20:14:28.564558 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:14:28.564533 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d1461df_f676_49a3_a685_4bddd22c8287.slice/crio-9eaf7ef8c9070f4c39521bab74d06abdb9f59af5ed50793202a5961797c1ae95 WatchSource:0}: Error finding container 9eaf7ef8c9070f4c39521bab74d06abdb9f59af5ed50793202a5961797c1ae95: Status 404 returned error can't find the container with id 9eaf7ef8c9070f4c39521bab74d06abdb9f59af5ed50793202a5961797c1ae95
Apr 16 20:14:28.764133 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.764105 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b64899dcd-25ctw" event={"ID":"3d1461df-f676-49a3-a685-4bddd22c8287","Type":"ContainerStarted","Data":"84791dbf517e14af6fc96eb2d42dde09095436657d0bac8c043f1170885d5f0e"}
Apr 16 20:14:28.764133 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.764138 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b64899dcd-25ctw" event={"ID":"3d1461df-f676-49a3-a685-4bddd22c8287","Type":"ContainerStarted","Data":"9eaf7ef8c9070f4c39521bab74d06abdb9f59af5ed50793202a5961797c1ae95"}
Apr 16 20:14:28.783588 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.783539 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7b64899dcd-25ctw" podStartSLOduration=16.783525662 podStartE2EDuration="16.783525662s" podCreationTimestamp="2026-04-16 20:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:28.782762038 +0000 UTC m=+153.222407003" watchObservedRunningTime="2026-04-16 20:14:28.783525662 +0000 UTC m=+153.223170625"
Apr 16 20:14:28.844001 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.843971 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s"
Apr 16 20:14:28.846199 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.846157 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls\") pod \"image-registry-55f777c949-lbh9s\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") " pod="openshift-image-registry/image-registry-55f777c949-lbh9s"
Apr 16 20:14:28.906659 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:28.906630 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55f777c949-lbh9s"
Apr 16 20:14:29.019775 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:29.019743 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55f777c949-lbh9s"]
Apr 16 20:14:29.022929 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:14:29.022895 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8034ee30_2d4d_4d63_b281_5171ce85150a.slice/crio-e5dbaf800c9fb85111702ccc8bb700537f4cda82db0960a0f7587255f61b568b WatchSource:0}: Error finding container e5dbaf800c9fb85111702ccc8bb700537f4cda82db0960a0f7587255f61b568b: Status 404 returned error can't find the container with id e5dbaf800c9fb85111702ccc8bb700537f4cda82db0960a0f7587255f61b568b
Apr 16 20:14:29.444232 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:29.444204 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:29.446752 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:29.446732 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:29.768396 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:29.768364 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55f777c949-lbh9s" event={"ID":"8034ee30-2d4d-4d63-b281-5171ce85150a","Type":"ContainerStarted","Data":"5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67"}
Apr 16 20:14:29.768396 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:29.768406 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55f777c949-lbh9s" event={"ID":"8034ee30-2d4d-4d63-b281-5171ce85150a","Type":"ContainerStarted","Data":"e5dbaf800c9fb85111702ccc8bb700537f4cda82db0960a0f7587255f61b568b"}
Apr 16 20:14:29.768637 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:29.768524 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-55f777c949-lbh9s"
Apr 16 20:14:29.768637 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:29.768556 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:29.769761 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:29.769740 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7b64899dcd-25ctw"
Apr 16 20:14:29.788859 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:29.788813 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-55f777c949-lbh9s" podStartSLOduration=17.788800653 podStartE2EDuration="17.788800653s" podCreationTimestamp="2026-04-16 20:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:29.787708128 +0000 UTC m=+154.227353092" watchObservedRunningTime="2026-04-16 20:14:29.788800653 +0000 UTC m=+154.228445637"
Apr 16 20:14:31.953266 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:31.953227 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-k5t5f" podUID="96e95540-055f-454d-b85c-31093fbd7bbf"
Apr 16 20:14:31.979816 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:31.979785 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-t7fts" podUID="42bbcb75-7cbe-482c-8c08-a9ceeb1c626d"
Apr 16 20:14:32.268374 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:32.268332 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jdfnl" podUID="02d874be-6206-4feb-99d1-3539318d290b"
Apr 16 20:14:32.774956 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:32.774927 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t7fts"
Apr 16 20:14:32.774956 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:32.774961 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k5t5f"
Apr 16 20:14:36.805056 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:36.805010 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f"
Apr 16 20:14:36.807236 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:36.807218 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96e95540-055f-454d-b85c-31093fbd7bbf-metrics-tls\") pod \"dns-default-k5t5f\" (UID: \"96e95540-055f-454d-b85c-31093fbd7bbf\") " pod="openshift-dns/dns-default-k5t5f"
Apr 16 20:14:36.905839 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:36.905796 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts"
Apr 16 20:14:36.908052 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:36.908031 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42bbcb75-7cbe-482c-8c08-a9ceeb1c626d-cert\") pod \"ingress-canary-t7fts\" (UID: \"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d\") " pod="openshift-ingress-canary/ingress-canary-t7fts"
Apr 16 20:14:36.978344 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:36.978318 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jqqmg\""
Apr 16 20:14:36.978905 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:36.978884 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dstbq\""
Apr 16 20:14:36.986498 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:36.986481 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k5t5f"
Apr 16 20:14:36.986612 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:36.986568 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t7fts"
Apr 16 20:14:37.117517 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:37.117479 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k5t5f"]
Apr 16 20:14:37.122335 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:14:37.122311 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96e95540_055f_454d_b85c_31093fbd7bbf.slice/crio-c425439ee4db77c9979f9f776ecc99b2c463193e55c8dd5f03d0ae387737a3b6 WatchSource:0}: Error finding container c425439ee4db77c9979f9f776ecc99b2c463193e55c8dd5f03d0ae387737a3b6: Status 404 returned error can't find the container with id c425439ee4db77c9979f9f776ecc99b2c463193e55c8dd5f03d0ae387737a3b6
Apr 16 20:14:37.128546 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:37.128524 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t7fts"]
Apr 16 20:14:37.131611 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:14:37.131569 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42bbcb75_7cbe_482c_8c08_a9ceeb1c626d.slice/crio-2105f353d4b980f419afdef0926d7023b868b5af39f0458c066e4342651cb89a WatchSource:0}: Error finding container 2105f353d4b980f419afdef0926d7023b868b5af39f0458c066e4342651cb89a: Status 404 returned error can't find the container with id 2105f353d4b980f419afdef0926d7023b868b5af39f0458c066e4342651cb89a
Apr 16 20:14:37.789283 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:37.789245 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t7fts" event={"ID":"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d","Type":"ContainerStarted","Data":"2105f353d4b980f419afdef0926d7023b868b5af39f0458c066e4342651cb89a"}
Apr 16 20:14:37.790450 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:37.790418 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k5t5f" event={"ID":"96e95540-055f-454d-b85c-31093fbd7bbf","Type":"ContainerStarted","Data":"c425439ee4db77c9979f9f776ecc99b2c463193e55c8dd5f03d0ae387737a3b6"}
Apr 16 20:14:38.307137 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.307100 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55f777c949-lbh9s"]
Apr 16 20:14:38.355778 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.355749 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-x9xwk"]
Apr 16 20:14:38.359005 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.358983 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.363819 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.363795 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 20:14:38.363819 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.363809 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 20:14:38.363984 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.363908 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zvwln\""
Apr 16 20:14:38.423146 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.423102 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x9xwk"]
Apr 16 20:14:38.518527 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.518482 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/025e9cf8-7d1b-4745-92f1-6017cc3167d6-data-volume\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.518714 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.518528 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hfm4\" (UniqueName: \"kubernetes.io/projected/025e9cf8-7d1b-4745-92f1-6017cc3167d6-kube-api-access-4hfm4\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.518714 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.518592 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/025e9cf8-7d1b-4745-92f1-6017cc3167d6-crio-socket\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.518811 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.518747 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/025e9cf8-7d1b-4745-92f1-6017cc3167d6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.518811 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.518787 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/025e9cf8-7d1b-4745-92f1-6017cc3167d6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.620220 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.620186 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/025e9cf8-7d1b-4745-92f1-6017cc3167d6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.620220 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.620225 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/025e9cf8-7d1b-4745-92f1-6017cc3167d6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.620455 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.620284 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/025e9cf8-7d1b-4745-92f1-6017cc3167d6-data-volume\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.620455 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.620314 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hfm4\" (UniqueName: \"kubernetes.io/projected/025e9cf8-7d1b-4745-92f1-6017cc3167d6-kube-api-access-4hfm4\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.620544 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.620506 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/025e9cf8-7d1b-4745-92f1-6017cc3167d6-crio-socket\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.620651 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.620632 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/025e9cf8-7d1b-4745-92f1-6017cc3167d6-crio-socket\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.620719 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.620682 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/025e9cf8-7d1b-4745-92f1-6017cc3167d6-data-volume\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.620879 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.620858 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/025e9cf8-7d1b-4745-92f1-6017cc3167d6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.622914 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.622890 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/025e9cf8-7d1b-4745-92f1-6017cc3167d6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.633468 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.633441 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hfm4\" (UniqueName: \"kubernetes.io/projected/025e9cf8-7d1b-4745-92f1-6017cc3167d6-kube-api-access-4hfm4\") pod \"insights-runtime-extractor-x9xwk\" (UID: \"025e9cf8-7d1b-4745-92f1-6017cc3167d6\") " pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.670194 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.670174 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x9xwk"
Apr 16 20:14:38.964519 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:38.964487 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x9xwk"]
Apr 16 20:14:38.969593 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:14:38.969564 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod025e9cf8_7d1b_4745_92f1_6017cc3167d6.slice/crio-47638b3874944aad59f33ee73278c4bc92cbad23df010430a4cead53d22f771d WatchSource:0}: Error finding container 47638b3874944aad59f33ee73278c4bc92cbad23df010430a4cead53d22f771d: Status 404 returned error can't find the container with id 47638b3874944aad59f33ee73278c4bc92cbad23df010430a4cead53d22f771d
Apr 16 20:14:39.799405 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:39.799313 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x9xwk" event={"ID":"025e9cf8-7d1b-4745-92f1-6017cc3167d6","Type":"ContainerStarted","Data":"81eba425c4293301f2364ceb5d3ba27ce49beaec607afc5c90132fba18be6ece"}
Apr 16 20:14:39.799405 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:39.799350 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x9xwk" event={"ID":"025e9cf8-7d1b-4745-92f1-6017cc3167d6","Type":"ContainerStarted","Data":"42e8354de967aed7d66f773c338c9a04accec1c00e4efb3e2501d1f8bb2d6733"}
Apr 16 20:14:39.799405 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:39.799363 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x9xwk" event={"ID":"025e9cf8-7d1b-4745-92f1-6017cc3167d6","Type":"ContainerStarted","Data":"47638b3874944aad59f33ee73278c4bc92cbad23df010430a4cead53d22f771d"}
Apr 16 20:14:39.800498 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:39.800475 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t7fts" event={"ID":"42bbcb75-7cbe-482c-8c08-a9ceeb1c626d","Type":"ContainerStarted","Data":"c7a0dac876fbfce1dd19a778a0d0032c03265a7f2a1474774e6ac96f97e353cb"}
Apr 16 20:14:39.801866 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:39.801847 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k5t5f" event={"ID":"96e95540-055f-454d-b85c-31093fbd7bbf","Type":"ContainerStarted","Data":"004b80cbd67b789b7dc50c11b4714b4cc989f03202ea7a417d5dedcaac8fc85f"}
Apr 16 20:14:39.801866 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:39.801869 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k5t5f" event={"ID":"96e95540-055f-454d-b85c-31093fbd7bbf","Type":"ContainerStarted","Data":"91bc448cd3b81e11e0e33015caf4d019b50046eb0b98baee4655ac9f35552026"}
Apr 16 20:14:39.802050 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:39.802039 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-k5t5f"
Apr 16 20:14:39.816249 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:39.816187 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-t7fts" podStartSLOduration=130.106059227 podStartE2EDuration="2m11.816172803s" podCreationTimestamp="2026-04-16 20:12:28 +0000 UTC" firstStartedPulling="2026-04-16 20:14:37.133251088 +0000 UTC m=+161.572896033" lastFinishedPulling="2026-04-16 20:14:38.843364652 +0000 UTC m=+163.283009609" observedRunningTime="2026-04-16 20:14:39.81583266 +0000 UTC m=+164.255477637" watchObservedRunningTime="2026-04-16 20:14:39.816172803 +0000 UTC m=+164.255817772"
Apr 16 20:14:39.833799 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:39.833748 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-k5t5f" podStartSLOduration=130.11984699 podStartE2EDuration="2m11.833733326s" podCreationTimestamp="2026-04-16 20:12:28 +0000 UTC" firstStartedPulling="2026-04-16 20:14:37.124431571 +0000 UTC m=+161.564076514" lastFinishedPulling="2026-04-16 20:14:38.838317895 +0000 UTC m=+163.277962850" observedRunningTime="2026-04-16 20:14:39.832337186 +0000 UTC m=+164.271982150" watchObservedRunningTime="2026-04-16 20:14:39.833733326 +0000 UTC m=+164.273378291"
Apr 16 20:14:41.809482 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:41.809447 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x9xwk" event={"ID":"025e9cf8-7d1b-4745-92f1-6017cc3167d6","Type":"ContainerStarted","Data":"3236e73917fea4b38741a00e600cd354afa47bd9aeeb220c0533d035db96f85b"}
Apr 16 20:14:41.832848 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:41.832798 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-x9xwk" podStartSLOduration=2.009679137 podStartE2EDuration="3.83278529s" podCreationTimestamp="2026-04-16 20:14:38 +0000 UTC" firstStartedPulling="2026-04-16 20:14:39.035182836 +0000 UTC m=+163.474827792" lastFinishedPulling="2026-04-16 20:14:40.858288997 +0000 UTC m=+165.297933945" observedRunningTime="2026-04-16 20:14:41.831769531 +0000 UTC m=+166.271414494" watchObservedRunningTime="2026-04-16 20:14:41.83278529 +0000 UTC m=+166.272430245"
Apr 16 20:14:44.364361 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:44.364327 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"
Apr 16 20:14:44.366606 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:44.366579 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for
volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a5125243-6f0d-4b3b-a7dc-3c481a10fdcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcndm\" (UID: \"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm" Apr 16 20:14:44.639925 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:44.639848 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm" Apr 16 20:14:44.775656 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:44.775627 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm"] Apr 16 20:14:44.778976 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:14:44.778948 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5125243_6f0d_4b3b_a7dc_3c481a10fdcb.slice/crio-2be82f6c9316ccc4b22f4fcc5d924ed61865d0ed717b07248ea9196644212be1 WatchSource:0}: Error finding container 2be82f6c9316ccc4b22f4fcc5d924ed61865d0ed717b07248ea9196644212be1: Status 404 returned error can't find the container with id 2be82f6c9316ccc4b22f4fcc5d924ed61865d0ed717b07248ea9196644212be1 Apr 16 20:14:44.817310 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:44.817282 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm" event={"ID":"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb","Type":"ContainerStarted","Data":"2be82f6c9316ccc4b22f4fcc5d924ed61865d0ed717b07248ea9196644212be1"} Apr 16 20:14:46.823798 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:46.823764 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm" 
event={"ID":"a5125243-6f0d-4b3b-a7dc-3c481a10fdcb","Type":"ContainerStarted","Data":"21d9cb96b576862081e59dc990b5d9fde8155b1a1dd669beb72a01ec5636330f"} Apr 16 20:14:46.844650 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:46.844591 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcndm" podStartSLOduration=33.387805353 podStartE2EDuration="34.844576642s" podCreationTimestamp="2026-04-16 20:14:12 +0000 UTC" firstStartedPulling="2026-04-16 20:14:44.780817951 +0000 UTC m=+169.220462897" lastFinishedPulling="2026-04-16 20:14:46.237589241 +0000 UTC m=+170.677234186" observedRunningTime="2026-04-16 20:14:46.843779258 +0000 UTC m=+171.283424223" watchObservedRunningTime="2026-04-16 20:14:46.844576642 +0000 UTC m=+171.284221607" Apr 16 20:14:47.244356 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:47.244335 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl" Apr 16 20:14:48.312450 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:48.312418 2566 patch_prober.go:28] interesting pod/image-registry-55f777c949-lbh9s container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 20:14:48.312826 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:48.312466 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-55f777c949-lbh9s" podUID="8034ee30-2d4d-4d63-b281-5171ce85150a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:14:49.785906 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.785878 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-s7xs5"] 
Apr 16 20:14:49.788958 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.788942 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:49.792197 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.792166 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-qdd4q\"" Apr 16 20:14:49.792333 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.792207 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 20:14:49.792491 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.792473 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 20:14:49.792568 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.792511 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 20:14:49.796749 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.796728 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-s7xs5"] Apr 16 20:14:49.806663 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.806646 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-k5t5f" Apr 16 20:14:49.808165 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.808146 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a07d554b-3b76-42a4-90f0-795d65c8a58a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: \"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:49.808246 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.808193 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a07d554b-3b76-42a4-90f0-795d65c8a58a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: \"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:49.808246 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.808235 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a07d554b-3b76-42a4-90f0-795d65c8a58a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: \"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:49.808316 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.808283 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz5h9\" (UniqueName: \"kubernetes.io/projected/a07d554b-3b76-42a4-90f0-795d65c8a58a-kube-api-access-zz5h9\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: \"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:49.909288 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.909246 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a07d554b-3b76-42a4-90f0-795d65c8a58a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: \"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:49.909479 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.909318 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5h9\" (UniqueName: \"kubernetes.io/projected/a07d554b-3b76-42a4-90f0-795d65c8a58a-kube-api-access-zz5h9\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: \"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:49.909479 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.909395 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a07d554b-3b76-42a4-90f0-795d65c8a58a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: \"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:49.909635 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.909567 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a07d554b-3b76-42a4-90f0-795d65c8a58a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: \"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:49.909736 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:49.909715 2566 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 20:14:49.909833 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:49.909817 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a07d554b-3b76-42a4-90f0-795d65c8a58a-prometheus-operator-tls podName:a07d554b-3b76-42a4-90f0-795d65c8a58a nodeName:}" failed. No retries permitted until 2026-04-16 20:14:50.40979566 +0000 UTC m=+174.849440603 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/a07d554b-3b76-42a4-90f0-795d65c8a58a-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-s7xs5" (UID: "a07d554b-3b76-42a4-90f0-795d65c8a58a") : secret "prometheus-operator-tls" not found Apr 16 20:14:49.910021 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.910003 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a07d554b-3b76-42a4-90f0-795d65c8a58a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: \"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:49.911726 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.911709 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a07d554b-3b76-42a4-90f0-795d65c8a58a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: \"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:49.919633 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:49.919593 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz5h9\" (UniqueName: \"kubernetes.io/projected/a07d554b-3b76-42a4-90f0-795d65c8a58a-kube-api-access-zz5h9\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: \"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:50.412833 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:50.412802 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a07d554b-3b76-42a4-90f0-795d65c8a58a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: 
\"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:50.415152 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:50.415131 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a07d554b-3b76-42a4-90f0-795d65c8a58a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-s7xs5\" (UID: \"a07d554b-3b76-42a4-90f0-795d65c8a58a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:50.698161 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:50.698068 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" Apr 16 20:14:50.816069 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:50.816043 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-s7xs5"] Apr 16 20:14:50.819522 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:14:50.819493 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda07d554b_3b76_42a4_90f0_795d65c8a58a.slice/crio-4117184fabaf9c250b2bfe865a9dd39870af142f07d2e216abaa0e4e602add4a WatchSource:0}: Error finding container 4117184fabaf9c250b2bfe865a9dd39870af142f07d2e216abaa0e4e602add4a: Status 404 returned error can't find the container with id 4117184fabaf9c250b2bfe865a9dd39870af142f07d2e216abaa0e4e602add4a Apr 16 20:14:50.835420 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:50.835394 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" event={"ID":"a07d554b-3b76-42a4-90f0-795d65c8a58a","Type":"ContainerStarted","Data":"4117184fabaf9c250b2bfe865a9dd39870af142f07d2e216abaa0e4e602add4a"} Apr 16 20:14:52.841230 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:52.841196 2566 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" event={"ID":"a07d554b-3b76-42a4-90f0-795d65c8a58a","Type":"ContainerStarted","Data":"2a106aaa93e2dee0ddbec30e29bceb051bc1b97d90558bb066cbf92197fc308a"} Apr 16 20:14:52.841230 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:52.841236 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" event={"ID":"a07d554b-3b76-42a4-90f0-795d65c8a58a","Type":"ContainerStarted","Data":"a8268203ba3ccf1091d583970bec026ab57fc8caae68145cb4f351687511bac1"} Apr 16 20:14:52.860343 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:52.860297 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-s7xs5" podStartSLOduration=2.7057712499999997 podStartE2EDuration="3.860284265s" podCreationTimestamp="2026-04-16 20:14:49 +0000 UTC" firstStartedPulling="2026-04-16 20:14:50.821358917 +0000 UTC m=+175.261003859" lastFinishedPulling="2026-04-16 20:14:51.97587193 +0000 UTC m=+176.415516874" observedRunningTime="2026-04-16 20:14:52.858822137 +0000 UTC m=+177.298467102" watchObservedRunningTime="2026-04-16 20:14:52.860284265 +0000 UTC m=+177.299929228" Apr 16 20:14:55.205515 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.205482 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hlml2"] Apr 16 20:14:55.209472 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.209331 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.213442 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.213419 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 20:14:55.213589 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.213569 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 20:14:55.213682 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.213664 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 20:14:55.213726 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.213676 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-q4rxk\"" Apr 16 20:14:55.245210 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.245182 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.245350 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.245216 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-wtmp\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.245350 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.245235 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-688hq\" (UniqueName: \"kubernetes.io/projected/d0f052db-d4c8-42a7-8862-1360fad89eb4-kube-api-access-688hq\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.245350 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.245251 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-tls\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.245350 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.245297 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d0f052db-d4c8-42a7-8862-1360fad89eb4-sys\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.245350 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.245331 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d0f052db-d4c8-42a7-8862-1360fad89eb4-root\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.245544 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.245363 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0f052db-d4c8-42a7-8862-1360fad89eb4-metrics-client-ca\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.245544 
ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.245414 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-accelerators-collector-config\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.245544 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.245457 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-textfile\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.346690 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.346661 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-wtmp\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.346841 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.346698 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-688hq\" (UniqueName: \"kubernetes.io/projected/d0f052db-d4c8-42a7-8862-1360fad89eb4-kube-api-access-688hq\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.346841 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.346724 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-tls\") pod 
\"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.346841 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.346814 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d0f052db-d4c8-42a7-8862-1360fad89eb4-sys\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.347025 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.346841 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-wtmp\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.347025 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.346857 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d0f052db-d4c8-42a7-8862-1360fad89eb4-sys\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.347025 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:55.346821 2566 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 20:14:55.347025 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.346884 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d0f052db-d4c8-42a7-8862-1360fad89eb4-root\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.347025 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.346915 2566 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0f052db-d4c8-42a7-8862-1360fad89eb4-metrics-client-ca\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.347025 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.346918 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d0f052db-d4c8-42a7-8862-1360fad89eb4-root\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2" Apr 16 20:14:55.347025 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:14:55.346936 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-tls podName:d0f052db-d4c8-42a7-8862-1360fad89eb4 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:55.846916823 +0000 UTC m=+180.286561764 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-tls") pod "node-exporter-hlml2" (UID: "d0f052db-d4c8-42a7-8862-1360fad89eb4") : secret "node-exporter-tls" not found
Apr 16 20:14:55.347025 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.346988 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-accelerators-collector-config\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2"
Apr 16 20:14:55.347416 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.347052 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-textfile\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2"
Apr 16 20:14:55.347416 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.347132 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2"
Apr 16 20:14:55.347532 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.347479 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-accelerators-collector-config\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2"
Apr 16 20:14:55.347532 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.347506 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0f052db-d4c8-42a7-8862-1360fad89eb4-metrics-client-ca\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2"
Apr 16 20:14:55.348022 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.348000 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-textfile\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2"
Apr 16 20:14:55.349780 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.349759 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2"
Apr 16 20:14:55.359828 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.359800 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-688hq\" (UniqueName: \"kubernetes.io/projected/d0f052db-d4c8-42a7-8862-1360fad89eb4-kube-api-access-688hq\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2"
Apr 16 20:14:55.850962 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.850922 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-tls\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2"
Apr 16 20:14:55.853154 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:55.853126 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d0f052db-d4c8-42a7-8862-1360fad89eb4-node-exporter-tls\") pod \"node-exporter-hlml2\" (UID: \"d0f052db-d4c8-42a7-8862-1360fad89eb4\") " pod="openshift-monitoring/node-exporter-hlml2"
Apr 16 20:14:56.120590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:56.120516 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-q4rxk\""
Apr 16 20:14:56.129458 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:56.129423 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hlml2"
Apr 16 20:14:56.137551 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:14:56.137525 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0f052db_d4c8_42a7_8862_1360fad89eb4.slice/crio-5d82fbb2b7c19c4cd8091b56340f7cb71c5390a1cb6e9affb1c3796ee62e676a WatchSource:0}: Error finding container 5d82fbb2b7c19c4cd8091b56340f7cb71c5390a1cb6e9affb1c3796ee62e676a: Status 404 returned error can't find the container with id 5d82fbb2b7c19c4cd8091b56340f7cb71c5390a1cb6e9affb1c3796ee62e676a
Apr 16 20:14:56.853181 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:56.853141 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hlml2" event={"ID":"d0f052db-d4c8-42a7-8862-1360fad89eb4","Type":"ContainerStarted","Data":"5d82fbb2b7c19c4cd8091b56340f7cb71c5390a1cb6e9affb1c3796ee62e676a"}
Apr 16 20:14:57.856925 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:57.856887 2566 generic.go:358] "Generic (PLEG): container finished" podID="d0f052db-d4c8-42a7-8862-1360fad89eb4" containerID="660193074d81ae5ebb3a24484642cb8c33540d427f1daed68eb09aaecb766649" exitCode=0
Apr 16 20:14:57.857405 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:57.856973 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hlml2" event={"ID":"d0f052db-d4c8-42a7-8862-1360fad89eb4","Type":"ContainerDied","Data":"660193074d81ae5ebb3a24484642cb8c33540d427f1daed68eb09aaecb766649"}
Apr 16 20:14:58.310971 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:58.310938 2566 patch_prober.go:28] interesting pod/image-registry-55f777c949-lbh9s container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 20:14:58.311135 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:58.310986 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-55f777c949-lbh9s" podUID="8034ee30-2d4d-4d63-b281-5171ce85150a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:14:58.861039 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:58.861006 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hlml2" event={"ID":"d0f052db-d4c8-42a7-8862-1360fad89eb4","Type":"ContainerStarted","Data":"9931748f1ce27b47e0bf270006d548252d7d525da09f978c0059fb5d921e48c8"}
Apr 16 20:14:58.861039 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:58.861039 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hlml2" event={"ID":"d0f052db-d4c8-42a7-8862-1360fad89eb4","Type":"ContainerStarted","Data":"79ef87758df0ab1befefafe79288bd745f03aedb97c4b27040dd518b6045721b"}
Apr 16 20:14:58.882870 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:14:58.882824 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hlml2" podStartSLOduration=3.162879537 podStartE2EDuration="3.882810028s" podCreationTimestamp="2026-04-16 20:14:55 +0000 UTC" firstStartedPulling="2026-04-16 20:14:56.139138475 +0000 UTC m=+180.578783421" lastFinishedPulling="2026-04-16 20:14:56.859068967 +0000 UTC m=+181.298713912" observedRunningTime="2026-04-16 20:14:58.88130677 +0000 UTC m=+183.320951738" watchObservedRunningTime="2026-04-16 20:14:58.882810028 +0000 UTC m=+183.322454991"
Apr 16 20:15:00.134635 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.134591 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-xg7v7"]
Apr 16 20:15:00.137092 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.137071 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-xg7v7"
Apr 16 20:15:00.140246 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.140229 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 20:15:00.140361 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.140280 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 20:15:00.140413 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.140396 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-g24tb\""
Apr 16 20:15:00.154149 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.154122 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-xg7v7"]
Apr 16 20:15:00.187347 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.187315 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llbzf\" (UniqueName: \"kubernetes.io/projected/2a8395f1-29fd-4e8d-aead-f6211089cc8d-kube-api-access-llbzf\") pod \"downloads-6bcc868b7-xg7v7\" (UID: \"2a8395f1-29fd-4e8d-aead-f6211089cc8d\") " pod="openshift-console/downloads-6bcc868b7-xg7v7"
Apr 16 20:15:00.288343 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.288314 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llbzf\" (UniqueName: \"kubernetes.io/projected/2a8395f1-29fd-4e8d-aead-f6211089cc8d-kube-api-access-llbzf\") pod \"downloads-6bcc868b7-xg7v7\" (UID: \"2a8395f1-29fd-4e8d-aead-f6211089cc8d\") " pod="openshift-console/downloads-6bcc868b7-xg7v7"
Apr 16 20:15:00.297631 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.297591 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llbzf\" (UniqueName: \"kubernetes.io/projected/2a8395f1-29fd-4e8d-aead-f6211089cc8d-kube-api-access-llbzf\") pod \"downloads-6bcc868b7-xg7v7\" (UID: \"2a8395f1-29fd-4e8d-aead-f6211089cc8d\") " pod="openshift-console/downloads-6bcc868b7-xg7v7"
Apr 16 20:15:00.445952 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.445880 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-xg7v7"
Apr 16 20:15:00.540495 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.540468 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5fb6b49696-77fxj"]
Apr 16 20:15:00.543164 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.543147 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.545523 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.545495 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 20:15:00.546026 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.546002 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 20:15:00.546134 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.546072 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 20:15:00.546134 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.546079 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 20:15:00.546134 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.546122 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 20:15:00.546525 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.546399 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-4qjcw\""
Apr 16 20:15:00.551030 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.551009 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 20:15:00.561429 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.561411 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5fb6b49696-77fxj"]
Apr 16 20:15:00.580029 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.580010 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-xg7v7"]
Apr 16 20:15:00.583034 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:15:00.583013 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a8395f1_29fd_4e8d_aead_f6211089cc8d.slice/crio-e46312c87652671321d55ac1aa12a3f4bec4c8bbe6085fe376bc0c555eabfc36 WatchSource:0}: Error finding container e46312c87652671321d55ac1aa12a3f4bec4c8bbe6085fe376bc0c555eabfc36: Status 404 returned error can't find the container with id e46312c87652671321d55ac1aa12a3f4bec4c8bbe6085fe376bc0c555eabfc36
Apr 16 20:15:00.590542 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.590520 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-federate-client-tls\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.590642 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.590552 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.590642 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.590588 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-metrics-client-ca\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.590729 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.590674 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-telemeter-client-tls\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.590729 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.590716 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-secret-telemeter-client\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.590793 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.590739 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.590793 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.590783 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-serving-certs-ca-bundle\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.590854 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.590814 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vft27\" (UniqueName: \"kubernetes.io/projected/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-kube-api-access-vft27\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.691463 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.691431 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-telemeter-client-tls\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.691642 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.691477 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-secret-telemeter-client\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.691642 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.691499 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.691642 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.691517 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-serving-certs-ca-bundle\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.691642 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.691537 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vft27\" (UniqueName: \"kubernetes.io/projected/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-kube-api-access-vft27\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.691642 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.691585 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-federate-client-tls\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.691642 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.691625 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.691956 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.691658 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-metrics-client-ca\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.692335 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.692308 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-serving-certs-ca-bundle\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.692430 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.692413 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-metrics-client-ca\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.692641 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.692593 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.694825 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.694799 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.694943 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.694839 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-secret-telemeter-client\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.695004 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.694943 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-telemeter-client-tls\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.695004 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.694948 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-federate-client-tls\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.700050 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.699998 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vft27\" (UniqueName: \"kubernetes.io/projected/8045ea37-6b10-43e2-90c7-50ee6a8ba3f3-kube-api-access-vft27\") pod \"telemeter-client-5fb6b49696-77fxj\" (UID: \"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3\") " pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.854976 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.854942 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
Apr 16 20:15:00.869741 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.869710 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-xg7v7" event={"ID":"2a8395f1-29fd-4e8d-aead-f6211089cc8d","Type":"ContainerStarted","Data":"e46312c87652671321d55ac1aa12a3f4bec4c8bbe6085fe376bc0c555eabfc36"}
Apr 16 20:15:00.988316 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:00.988287 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5fb6b49696-77fxj"]
Apr 16 20:15:00.994049 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:15:00.994018 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8045ea37_6b10_43e2_90c7_50ee6a8ba3f3.slice/crio-066922910f9af25d5857b612618a997356c02e0bdcb27f12db7f761922e49511 WatchSource:0}: Error finding container 066922910f9af25d5857b612618a997356c02e0bdcb27f12db7f761922e49511: Status 404 returned error can't find the container with id 066922910f9af25d5857b612618a997356c02e0bdcb27f12db7f761922e49511
Apr 16 20:15:01.874273 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:01.874213 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj" event={"ID":"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3","Type":"ContainerStarted","Data":"066922910f9af25d5857b612618a997356c02e0bdcb27f12db7f761922e49511"}
Apr 16 20:15:02.879114 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:02.879075 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj" event={"ID":"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3","Type":"ContainerStarted","Data":"4f306a8ce4f99e943dd8c8b009df79e7546af276f0c78f22ab2cbb51dbc945c7"}
Apr 16 20:15:03.326614 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.326559 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-55f777c949-lbh9s" podUID="8034ee30-2d4d-4d63-b281-5171ce85150a" containerName="registry" containerID="cri-o://5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67" gracePeriod=30
Apr 16 20:15:03.873149 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.873123 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55f777c949-lbh9s"
Apr 16 20:15:03.883578 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.883551 2566 generic.go:358] "Generic (PLEG): container finished" podID="8034ee30-2d4d-4d63-b281-5171ce85150a" containerID="5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67" exitCode=0
Apr 16 20:15:03.883937 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.883638 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55f777c949-lbh9s"
Apr 16 20:15:03.883937 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.883635 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55f777c949-lbh9s" event={"ID":"8034ee30-2d4d-4d63-b281-5171ce85150a","Type":"ContainerDied","Data":"5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67"}
Apr 16 20:15:03.883937 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.883743 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55f777c949-lbh9s" event={"ID":"8034ee30-2d4d-4d63-b281-5171ce85150a","Type":"ContainerDied","Data":"e5dbaf800c9fb85111702ccc8bb700537f4cda82db0960a0f7587255f61b568b"}
Apr 16 20:15:03.883937 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.883766 2566 scope.go:117] "RemoveContainer" containerID="5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67"
Apr 16 20:15:03.885451 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.885429 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj" event={"ID":"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3","Type":"ContainerStarted","Data":"22d090adb7becb28ae512033527f7440e8e619b7d01515f5f10143e1b9a089ab"}
Apr 16 20:15:03.894359 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.894289 2566 scope.go:117] "RemoveContainer" containerID="5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67"
Apr 16 20:15:03.894840 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:15:03.894762 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67\": container with ID starting with 5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67 not found: ID does not exist" containerID="5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67"
Apr 16 20:15:03.894840 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.894797 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67"} err="failed to get container status \"5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67\": rpc error: code = NotFound desc = could not find container \"5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67\": container with ID starting with 5919422b82f45aa62465202d189111af73ed4ac73cf7969c508815bac686bf67 not found: ID does not exist"
Apr 16 20:15:03.923402 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.923377 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-bound-sa-token\") pod \"8034ee30-2d4d-4d63-b281-5171ce85150a\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") "
Apr 16 20:15:03.923518 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.923422 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-certificates\") pod \"8034ee30-2d4d-4d63-b281-5171ce85150a\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") "
Apr 16 20:15:03.923518 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.923460 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls\") pod \"8034ee30-2d4d-4d63-b281-5171ce85150a\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") "
Apr 16 20:15:03.923518 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.923499 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm6q2\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-kube-api-access-sm6q2\") pod \"8034ee30-2d4d-4d63-b281-5171ce85150a\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") "
Apr 16 20:15:03.923646 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.923565 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8034ee30-2d4d-4d63-b281-5171ce85150a-trusted-ca\") pod \"8034ee30-2d4d-4d63-b281-5171ce85150a\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") "
Apr 16 20:15:03.923646 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.923592 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8034ee30-2d4d-4d63-b281-5171ce85150a-ca-trust-extracted\") pod \"8034ee30-2d4d-4d63-b281-5171ce85150a\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") "
Apr 16 20:15:03.923747 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.923657 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8034ee30-2d4d-4d63-b281-5171ce85150a-installation-pull-secrets\") pod \"8034ee30-2d4d-4d63-b281-5171ce85150a\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") "
Apr 16 20:15:03.923747 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.923687 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8034ee30-2d4d-4d63-b281-5171ce85150a-image-registry-private-configuration\") pod \"8034ee30-2d4d-4d63-b281-5171ce85150a\" (UID: \"8034ee30-2d4d-4d63-b281-5171ce85150a\") "
Apr 16 20:15:03.925679 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.925634 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8034ee30-2d4d-4d63-b281-5171ce85150a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8034ee30-2d4d-4d63-b281-5171ce85150a" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:15:03.926191 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.925970 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8034ee30-2d4d-4d63-b281-5171ce85150a" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:15:03.927111 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.926447 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8034ee30-2d4d-4d63-b281-5171ce85150a" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:15:03.928182 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.928139 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8034ee30-2d4d-4d63-b281-5171ce85150a" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:15:03.928500 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.928456 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8034ee30-2d4d-4d63-b281-5171ce85150a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8034ee30-2d4d-4d63-b281-5171ce85150a" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:15:03.930018 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.929988 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8034ee30-2d4d-4d63-b281-5171ce85150a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "8034ee30-2d4d-4d63-b281-5171ce85150a" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:15:03.931091 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.931045 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-kube-api-access-sm6q2" (OuterVolumeSpecName: "kube-api-access-sm6q2") pod "8034ee30-2d4d-4d63-b281-5171ce85150a" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a"). InnerVolumeSpecName "kube-api-access-sm6q2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:15:03.936112 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:03.936084 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8034ee30-2d4d-4d63-b281-5171ce85150a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8034ee30-2d4d-4d63-b281-5171ce85150a" (UID: "8034ee30-2d4d-4d63-b281-5171ce85150a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:15:04.024971 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.024941 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8034ee30-2d4d-4d63-b281-5171ce85150a-ca-trust-extracted\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\""
Apr 16 20:15:04.024971 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.024975 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8034ee30-2d4d-4d63-b281-5171ce85150a-installation-pull-secrets\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\""
Apr 16 20:15:04.025195 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.024992 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8034ee30-2d4d-4d63-b281-5171ce85150a-image-registry-private-configuration\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\""
Apr 16 20:15:04.025195 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.025008 2566 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-bound-sa-token\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\""
Apr 16 20:15:04.025195 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.025022 2566 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-certificates\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\""
Apr 16 20:15:04.025195 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.025038 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-registry-tls\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\""
Apr 16 20:15:04.025195 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.025054 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sm6q2\" (UniqueName: \"kubernetes.io/projected/8034ee30-2d4d-4d63-b281-5171ce85150a-kube-api-access-sm6q2\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\""
Apr 16 20:15:04.025195 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.025067 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8034ee30-2d4d-4d63-b281-5171ce85150a-trusted-ca\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\""
Apr 16 20:15:04.211271 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.211240 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55f777c949-lbh9s"]
Apr 16 20:15:04.215381 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.215346 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-55f777c949-lbh9s"]
Apr 16 20:15:04.248096 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.248064 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8034ee30-2d4d-4d63-b281-5171ce85150a" path="/var/lib/kubelet/pods/8034ee30-2d4d-4d63-b281-5171ce85150a/volumes"
Apr 16 20:15:04.892391 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.892352 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj"
event={"ID":"8045ea37-6b10-43e2-90c7-50ee6a8ba3f3","Type":"ContainerStarted","Data":"40931a9acb4e82b316ad95ed2dbecac711db45db650cca8baed26d3d38353762"} Apr 16 20:15:04.921367 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:04.921317 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5fb6b49696-77fxj" podStartSLOduration=2.172803968 podStartE2EDuration="4.921302844s" podCreationTimestamp="2026-04-16 20:15:00 +0000 UTC" firstStartedPulling="2026-04-16 20:15:00.996456669 +0000 UTC m=+185.436101614" lastFinishedPulling="2026-04-16 20:15:03.744955547 +0000 UTC m=+188.184600490" observedRunningTime="2026-04-16 20:15:04.919910271 +0000 UTC m=+189.359555235" watchObservedRunningTime="2026-04-16 20:15:04.921302844 +0000 UTC m=+189.360947808" Apr 16 20:15:07.208279 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.206517 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55b879477f-vd57b"] Apr 16 20:15:07.208279 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.206880 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8034ee30-2d4d-4d63-b281-5171ce85150a" containerName="registry" Apr 16 20:15:07.208279 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.206897 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8034ee30-2d4d-4d63-b281-5171ce85150a" containerName="registry" Apr 16 20:15:07.208279 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.206976 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="8034ee30-2d4d-4d63-b281-5171ce85150a" containerName="registry" Apr 16 20:15:07.209002 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.208983 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.211294 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.211251 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 20:15:07.211424 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.211307 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 20:15:07.211424 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.211254 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 20:15:07.213046 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.212107 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 20:15:07.213046 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.212342 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 20:15:07.213046 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.212593 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-926v5\"" Apr 16 20:15:07.220682 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.220657 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55b879477f-vd57b"] Apr 16 20:15:07.253071 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.253040 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-service-ca\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.253207 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.253083 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-config\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.253207 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.253107 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-serving-cert\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.253298 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.253205 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66l6r\" (UniqueName: \"kubernetes.io/projected/23b8f027-972a-414f-a87b-8ff7e7e271ef-kube-api-access-66l6r\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.253298 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.253255 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-oauth-config\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.253377 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.253304 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-oauth-serving-cert\") pod \"console-55b879477f-vd57b\" (UID: 
\"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.354186 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.354150 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-oauth-serving-cert\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.354362 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.354220 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-service-ca\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.354362 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.354258 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-config\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.354362 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.354285 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-serving-cert\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.354362 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.354339 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66l6r\" (UniqueName: 
\"kubernetes.io/projected/23b8f027-972a-414f-a87b-8ff7e7e271ef-kube-api-access-66l6r\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.354554 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.354375 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-oauth-config\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.355127 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.355097 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-oauth-serving-cert\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.355225 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.355125 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-service-ca\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.355332 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.355311 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-config\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.357239 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.357221 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-oauth-config\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.357451 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.357432 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-serving-cert\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.362826 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.362803 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66l6r\" (UniqueName: \"kubernetes.io/projected/23b8f027-972a-414f-a87b-8ff7e7e271ef-kube-api-access-66l6r\") pod \"console-55b879477f-vd57b\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.522355 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.522303 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:07.673355 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.673326 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55b879477f-vd57b"] Apr 16 20:15:07.675426 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:15:07.675395 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b8f027_972a_414f_a87b_8ff7e7e271ef.slice/crio-4b09a50c20c007ccfafa81970879141d41c1f0727ea23d604530740438c610ac WatchSource:0}: Error finding container 4b09a50c20c007ccfafa81970879141d41c1f0727ea23d604530740438c610ac: Status 404 returned error can't find the container with id 4b09a50c20c007ccfafa81970879141d41c1f0727ea23d604530740438c610ac Apr 16 20:15:07.903350 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:07.903266 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b879477f-vd57b" event={"ID":"23b8f027-972a-414f-a87b-8ff7e7e271ef","Type":"ContainerStarted","Data":"4b09a50c20c007ccfafa81970879141d41c1f0727ea23d604530740438c610ac"} Apr 16 20:15:17.934161 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:17.934074 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-xg7v7" event={"ID":"2a8395f1-29fd-4e8d-aead-f6211089cc8d","Type":"ContainerStarted","Data":"ea54fda7f1f18ec6cfd2657f5cb935d499c1f068dc51195d11158ba3e7f6c377"} Apr 16 20:15:17.934675 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:17.934396 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-xg7v7" Apr 16 20:15:17.935873 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:17.935836 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b879477f-vd57b" 
event={"ID":"23b8f027-972a-414f-a87b-8ff7e7e271ef","Type":"ContainerStarted","Data":"ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e"} Apr 16 20:15:17.949108 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:17.949079 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-xg7v7" Apr 16 20:15:17.963185 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:17.963134 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-xg7v7" podStartSLOduration=0.951831039 podStartE2EDuration="17.96312089s" podCreationTimestamp="2026-04-16 20:15:00 +0000 UTC" firstStartedPulling="2026-04-16 20:15:00.584977913 +0000 UTC m=+185.024622855" lastFinishedPulling="2026-04-16 20:15:17.596267764 +0000 UTC m=+202.035912706" observedRunningTime="2026-04-16 20:15:17.961411521 +0000 UTC m=+202.401056487" watchObservedRunningTime="2026-04-16 20:15:17.96312089 +0000 UTC m=+202.402765853" Apr 16 20:15:18.006870 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:18.006819 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55b879477f-vd57b" podStartSLOduration=1.1302969250000001 podStartE2EDuration="11.006802437s" podCreationTimestamp="2026-04-16 20:15:07 +0000 UTC" firstStartedPulling="2026-04-16 20:15:07.677957957 +0000 UTC m=+192.117602904" lastFinishedPulling="2026-04-16 20:15:17.554463468 +0000 UTC m=+201.994108416" observedRunningTime="2026-04-16 20:15:17.984116926 +0000 UTC m=+202.423761890" watchObservedRunningTime="2026-04-16 20:15:18.006802437 +0000 UTC m=+202.446447403" Apr 16 20:15:27.048962 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:27.048931 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55b879477f-vd57b"] Apr 16 20:15:27.522976 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:27.522951 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:47.018894 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:47.018858 2566 generic.go:358] "Generic (PLEG): container finished" podID="387f5caa-46e7-4c7e-9eb3-9fececd0858d" containerID="b4ccbc7a1b12bb97bd4d599c897bd5065ad1bef312a0ef0eb37a0b474d9967d5" exitCode=0 Apr 16 20:15:47.019278 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:47.018931 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-q5swj" event={"ID":"387f5caa-46e7-4c7e-9eb3-9fececd0858d","Type":"ContainerDied","Data":"b4ccbc7a1b12bb97bd4d599c897bd5065ad1bef312a0ef0eb37a0b474d9967d5"} Apr 16 20:15:47.019316 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:47.019278 2566 scope.go:117] "RemoveContainer" containerID="b4ccbc7a1b12bb97bd4d599c897bd5065ad1bef312a0ef0eb37a0b474d9967d5" Apr 16 20:15:48.023750 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:48.023720 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-q5swj" event={"ID":"387f5caa-46e7-4c7e-9eb3-9fececd0858d","Type":"ContainerStarted","Data":"79bd25c287815080a104e4f89abcd7f94fb7b4e722acca48bd3c86b582ea4e78"} Apr 16 20:15:52.067362 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.067294 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55b879477f-vd57b" podUID="23b8f027-972a-414f-a87b-8ff7e7e271ef" containerName="console" containerID="cri-o://ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e" gracePeriod=15 Apr 16 20:15:52.334931 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.334910 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55b879477f-vd57b_23b8f027-972a-414f-a87b-8ff7e7e271ef/console/0.log" Apr 16 20:15:52.335062 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.334982 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:52.419251 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.419221 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-config\") pod \"23b8f027-972a-414f-a87b-8ff7e7e271ef\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " Apr 16 20:15:52.419251 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.419253 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-oauth-serving-cert\") pod \"23b8f027-972a-414f-a87b-8ff7e7e271ef\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " Apr 16 20:15:52.419459 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.419278 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-serving-cert\") pod \"23b8f027-972a-414f-a87b-8ff7e7e271ef\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " Apr 16 20:15:52.419459 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.419304 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-service-ca\") pod \"23b8f027-972a-414f-a87b-8ff7e7e271ef\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " Apr 16 20:15:52.419459 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.419332 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66l6r\" (UniqueName: \"kubernetes.io/projected/23b8f027-972a-414f-a87b-8ff7e7e271ef-kube-api-access-66l6r\") pod \"23b8f027-972a-414f-a87b-8ff7e7e271ef\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " Apr 16 20:15:52.419459 ip-10-0-142-60 
kubenswrapper[2566]: I0416 20:15:52.419363 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-oauth-config\") pod \"23b8f027-972a-414f-a87b-8ff7e7e271ef\" (UID: \"23b8f027-972a-414f-a87b-8ff7e7e271ef\") " Apr 16 20:15:52.419761 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.419720 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-service-ca" (OuterVolumeSpecName: "service-ca") pod "23b8f027-972a-414f-a87b-8ff7e7e271ef" (UID: "23b8f027-972a-414f-a87b-8ff7e7e271ef"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:15:52.419761 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.419751 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-config" (OuterVolumeSpecName: "console-config") pod "23b8f027-972a-414f-a87b-8ff7e7e271ef" (UID: "23b8f027-972a-414f-a87b-8ff7e7e271ef"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:15:52.419928 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.419820 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "23b8f027-972a-414f-a87b-8ff7e7e271ef" (UID: "23b8f027-972a-414f-a87b-8ff7e7e271ef"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:15:52.421520 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.421491 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b8f027-972a-414f-a87b-8ff7e7e271ef-kube-api-access-66l6r" (OuterVolumeSpecName: "kube-api-access-66l6r") pod "23b8f027-972a-414f-a87b-8ff7e7e271ef" (UID: "23b8f027-972a-414f-a87b-8ff7e7e271ef"). InnerVolumeSpecName "kube-api-access-66l6r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:15:52.421626 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.421511 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "23b8f027-972a-414f-a87b-8ff7e7e271ef" (UID: "23b8f027-972a-414f-a87b-8ff7e7e271ef"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:15:52.421626 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.421571 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "23b8f027-972a-414f-a87b-8ff7e7e271ef" (UID: "23b8f027-972a-414f-a87b-8ff7e7e271ef"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:15:52.520454 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.520427 2566 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-config\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\"" Apr 16 20:15:52.520454 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.520449 2566 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-oauth-serving-cert\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\"" Apr 16 20:15:52.520454 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.520460 2566 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-serving-cert\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\"" Apr 16 20:15:52.520660 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.520469 2566 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23b8f027-972a-414f-a87b-8ff7e7e271ef-service-ca\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\"" Apr 16 20:15:52.520660 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.520479 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-66l6r\" (UniqueName: \"kubernetes.io/projected/23b8f027-972a-414f-a87b-8ff7e7e271ef-kube-api-access-66l6r\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\"" Apr 16 20:15:52.520660 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:52.520488 2566 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23b8f027-972a-414f-a87b-8ff7e7e271ef-console-oauth-config\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\"" Apr 16 20:15:53.038434 ip-10-0-142-60 
kubenswrapper[2566]: I0416 20:15:53.038408 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55b879477f-vd57b_23b8f027-972a-414f-a87b-8ff7e7e271ef/console/0.log" Apr 16 20:15:53.038596 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:53.038447 2566 generic.go:358] "Generic (PLEG): container finished" podID="23b8f027-972a-414f-a87b-8ff7e7e271ef" containerID="ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e" exitCode=2 Apr 16 20:15:53.038596 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:53.038483 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b879477f-vd57b" event={"ID":"23b8f027-972a-414f-a87b-8ff7e7e271ef","Type":"ContainerDied","Data":"ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e"} Apr 16 20:15:53.038596 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:53.038511 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55b879477f-vd57b" Apr 16 20:15:53.038596 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:53.038519 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b879477f-vd57b" event={"ID":"23b8f027-972a-414f-a87b-8ff7e7e271ef","Type":"ContainerDied","Data":"4b09a50c20c007ccfafa81970879141d41c1f0727ea23d604530740438c610ac"} Apr 16 20:15:53.038596 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:53.038539 2566 scope.go:117] "RemoveContainer" containerID="ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e" Apr 16 20:15:53.047413 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:53.047394 2566 scope.go:117] "RemoveContainer" containerID="ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e" Apr 16 20:15:53.047678 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:15:53.047659 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e\": container with ID starting with ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e not found: ID does not exist" containerID="ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e"
Apr 16 20:15:53.047745 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:53.047686 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e"} err="failed to get container status \"ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e\": rpc error: code = NotFound desc = could not find container \"ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e\": container with ID starting with ca5257b91510a60a59f878fd95ff8ef6d7667d210432a1f7ac85a463e5a6f85e not found: ID does not exist"
Apr 16 20:15:53.060612 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:53.060582 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55b879477f-vd57b"]
Apr 16 20:15:53.064247 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:53.064226 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55b879477f-vd57b"]
Apr 16 20:15:54.248468 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:15:54.248436 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b8f027-972a-414f-a87b-8ff7e7e271ef" path="/var/lib/kubelet/pods/23b8f027-972a-414f-a87b-8ff7e7e271ef/volumes"
Apr 16 20:16:07.030787 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:16:07.030706 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:16:07.032931 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:16:07.032909 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02d874be-6206-4feb-99d1-3539318d290b-metrics-certs\") pod \"network-metrics-daemon-jdfnl\" (UID: \"02d874be-6206-4feb-99d1-3539318d290b\") " pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:16:07.047472 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:16:07.047447 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q2prt\""
Apr 16 20:16:07.055308 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:16:07.055291 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jdfnl"
Apr 16 20:16:07.171855 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:16:07.171699 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jdfnl"]
Apr 16 20:16:07.174525 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:16:07.174491 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02d874be_6206_4feb_99d1_3539318d290b.slice/crio-9fa1b832dca3915577a13a3dfa9808978c72134e9e735892995a47b1bcb6c946 WatchSource:0}: Error finding container 9fa1b832dca3915577a13a3dfa9808978c72134e9e735892995a47b1bcb6c946: Status 404 returned error can't find the container with id 9fa1b832dca3915577a13a3dfa9808978c72134e9e735892995a47b1bcb6c946
Apr 16 20:16:08.085090 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:16:08.085049 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jdfnl" event={"ID":"02d874be-6206-4feb-99d1-3539318d290b","Type":"ContainerStarted","Data":"9fa1b832dca3915577a13a3dfa9808978c72134e9e735892995a47b1bcb6c946"}
Apr 16 20:16:09.089110 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:16:09.089063 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jdfnl" event={"ID":"02d874be-6206-4feb-99d1-3539318d290b","Type":"ContainerStarted","Data":"e192f9e601f6592913a241fb977437593fecebf881f6a2cbe15e17e4e48ba5fb"}
Apr 16 20:16:09.089110 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:16:09.089101 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jdfnl" event={"ID":"02d874be-6206-4feb-99d1-3539318d290b","Type":"ContainerStarted","Data":"2e4c6cb57c5c2ae3e4d60bff227d28c8dc3fac428b10e8d0412f46facfb65066"}
Apr 16 20:16:09.104074 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:16:09.104033 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jdfnl" podStartSLOduration=252.151785713 podStartE2EDuration="4m13.104019193s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="2026-04-16 20:16:07.176266641 +0000 UTC m=+251.615911586" lastFinishedPulling="2026-04-16 20:16:08.128500121 +0000 UTC m=+252.568145066" observedRunningTime="2026-04-16 20:16:09.103023061 +0000 UTC m=+253.542668026" watchObservedRunningTime="2026-04-16 20:16:09.104019193 +0000 UTC m=+253.543664156"
Apr 16 20:17:20.339283 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.339249 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"]
Apr 16 20:17:20.341577 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.339495 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23b8f027-972a-414f-a87b-8ff7e7e271ef" containerName="console"
Apr 16 20:17:20.341577 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.339505 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b8f027-972a-414f-a87b-8ff7e7e271ef" containerName="console"
Apr 16 20:17:20.341577 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.339547 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="23b8f027-972a-414f-a87b-8ff7e7e271ef" containerName="console"
Apr 16 20:17:20.342438 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.342422 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:20.347499 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.347437 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 20:17:20.347499 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.347489 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 20:17:20.347693 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.347499 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 20:17:20.347771 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.347753 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 20:17:20.362450 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.362429 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"]
Apr 16 20:17:20.449918 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.449888 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf4c1fa5-010c-451f-a2e6-8fb84d8717fb-tmp\") pod \"klusterlet-addon-workmgr-76cdb954db-qm8wt\" (UID: \"bf4c1fa5-010c-451f-a2e6-8fb84d8717fb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:20.450052 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.449963 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bf4c1fa5-010c-451f-a2e6-8fb84d8717fb-klusterlet-config\") pod \"klusterlet-addon-workmgr-76cdb954db-qm8wt\" (UID: \"bf4c1fa5-010c-451f-a2e6-8fb84d8717fb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:20.450052 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.450000 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctbqq\" (UniqueName: \"kubernetes.io/projected/bf4c1fa5-010c-451f-a2e6-8fb84d8717fb-kube-api-access-ctbqq\") pod \"klusterlet-addon-workmgr-76cdb954db-qm8wt\" (UID: \"bf4c1fa5-010c-451f-a2e6-8fb84d8717fb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:20.551129 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.551101 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bf4c1fa5-010c-451f-a2e6-8fb84d8717fb-klusterlet-config\") pod \"klusterlet-addon-workmgr-76cdb954db-qm8wt\" (UID: \"bf4c1fa5-010c-451f-a2e6-8fb84d8717fb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:20.551256 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.551136 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctbqq\" (UniqueName: \"kubernetes.io/projected/bf4c1fa5-010c-451f-a2e6-8fb84d8717fb-kube-api-access-ctbqq\") pod \"klusterlet-addon-workmgr-76cdb954db-qm8wt\" (UID: \"bf4c1fa5-010c-451f-a2e6-8fb84d8717fb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:20.551256 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.551164 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf4c1fa5-010c-451f-a2e6-8fb84d8717fb-tmp\") pod \"klusterlet-addon-workmgr-76cdb954db-qm8wt\" (UID: \"bf4c1fa5-010c-451f-a2e6-8fb84d8717fb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:20.551501 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.551480 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf4c1fa5-010c-451f-a2e6-8fb84d8717fb-tmp\") pod \"klusterlet-addon-workmgr-76cdb954db-qm8wt\" (UID: \"bf4c1fa5-010c-451f-a2e6-8fb84d8717fb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:20.553553 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.553533 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bf4c1fa5-010c-451f-a2e6-8fb84d8717fb-klusterlet-config\") pod \"klusterlet-addon-workmgr-76cdb954db-qm8wt\" (UID: \"bf4c1fa5-010c-451f-a2e6-8fb84d8717fb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:20.559358 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.559340 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctbqq\" (UniqueName: \"kubernetes.io/projected/bf4c1fa5-010c-451f-a2e6-8fb84d8717fb-kube-api-access-ctbqq\") pod \"klusterlet-addon-workmgr-76cdb954db-qm8wt\" (UID: \"bf4c1fa5-010c-451f-a2e6-8fb84d8717fb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:20.651256 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.651181 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:20.768388 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.768356 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"]
Apr 16 20:17:20.771860 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:17:20.771828 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf4c1fa5_010c_451f_a2e6_8fb84d8717fb.slice/crio-a003ddc6628d125707bced9d9f200f297d59ccd3827e7fc8557f49436929e94b WatchSource:0}: Error finding container a003ddc6628d125707bced9d9f200f297d59ccd3827e7fc8557f49436929e94b: Status 404 returned error can't find the container with id a003ddc6628d125707bced9d9f200f297d59ccd3827e7fc8557f49436929e94b
Apr 16 20:17:20.773464 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:20.773448 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:17:21.281316 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:21.281274 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt" event={"ID":"bf4c1fa5-010c-451f-a2e6-8fb84d8717fb","Type":"ContainerStarted","Data":"a003ddc6628d125707bced9d9f200f297d59ccd3827e7fc8557f49436929e94b"}
Apr 16 20:17:25.295978 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:25.295938 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt" event={"ID":"bf4c1fa5-010c-451f-a2e6-8fb84d8717fb","Type":"ContainerStarted","Data":"5f4c24868c12b2f3664d6ed8570adc4658d429c21da040d41486557fe05a25ea"}
Apr 16 20:17:25.296377 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:25.296119 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:25.297628 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:25.297594 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt"
Apr 16 20:17:25.312228 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:25.312174 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-76cdb954db-qm8wt" podStartSLOduration=0.891613619 podStartE2EDuration="5.312157027s" podCreationTimestamp="2026-04-16 20:17:20 +0000 UTC" firstStartedPulling="2026-04-16 20:17:20.773591087 +0000 UTC m=+325.213236030" lastFinishedPulling="2026-04-16 20:17:25.194134482 +0000 UTC m=+329.633779438" observedRunningTime="2026-04-16 20:17:25.310590213 +0000 UTC m=+329.750235176" watchObservedRunningTime="2026-04-16 20:17:25.312157027 +0000 UTC m=+329.751801994"
Apr 16 20:17:28.248100 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.248070 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw"]
Apr 16 20:17:28.251473 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.251456 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw"
Apr 16 20:17:28.256582 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.256561 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 20:17:28.256727 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.256580 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-kxvb5\""
Apr 16 20:17:28.256799 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.256772 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 20:17:28.257211 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.257195 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 20:17:28.271654 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.271631 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw"]
Apr 16 20:17:28.302233 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.302208 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fzhw\" (UniqueName: \"kubernetes.io/projected/ee40948c-7863-4f9f-98d1-eb4fd26351b4-kube-api-access-4fzhw\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw\" (UID: \"ee40948c-7863-4f9f-98d1-eb4fd26351b4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw"
Apr 16 20:17:28.302343 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.302257 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ee40948c-7863-4f9f-98d1-eb4fd26351b4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw\" (UID: \"ee40948c-7863-4f9f-98d1-eb4fd26351b4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw"
Apr 16 20:17:28.402590 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.402561 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fzhw\" (UniqueName: \"kubernetes.io/projected/ee40948c-7863-4f9f-98d1-eb4fd26351b4-kube-api-access-4fzhw\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw\" (UID: \"ee40948c-7863-4f9f-98d1-eb4fd26351b4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw"
Apr 16 20:17:28.402711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.402624 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ee40948c-7863-4f9f-98d1-eb4fd26351b4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw\" (UID: \"ee40948c-7863-4f9f-98d1-eb4fd26351b4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw"
Apr 16 20:17:28.404877 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.404859 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/ee40948c-7863-4f9f-98d1-eb4fd26351b4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw\" (UID: \"ee40948c-7863-4f9f-98d1-eb4fd26351b4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw"
Apr 16 20:17:28.410907 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.410884 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fzhw\" (UniqueName: \"kubernetes.io/projected/ee40948c-7863-4f9f-98d1-eb4fd26351b4-kube-api-access-4fzhw\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw\" (UID: \"ee40948c-7863-4f9f-98d1-eb4fd26351b4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw"
Apr 16 20:17:28.561513 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.561456 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw"
Apr 16 20:17:28.684567 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:28.684535 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw"]
Apr 16 20:17:28.688769 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:17:28.688730 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee40948c_7863_4f9f_98d1_eb4fd26351b4.slice/crio-82a050d9d526361b6dd56493f9519225580830b8ca6be34a7660852b3f8b8c80 WatchSource:0}: Error finding container 82a050d9d526361b6dd56493f9519225580830b8ca6be34a7660852b3f8b8c80: Status 404 returned error can't find the container with id 82a050d9d526361b6dd56493f9519225580830b8ca6be34a7660852b3f8b8c80
Apr 16 20:17:29.307255 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:29.307224 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw" event={"ID":"ee40948c-7863-4f9f-98d1-eb4fd26351b4","Type":"ContainerStarted","Data":"82a050d9d526361b6dd56493f9519225580830b8ca6be34a7660852b3f8b8c80"}
Apr 16 20:17:32.318476 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.318439 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw" event={"ID":"ee40948c-7863-4f9f-98d1-eb4fd26351b4","Type":"ContainerStarted","Data":"8e5f0611fd03bd3410a094f3d8035904b3eba6fabb5e4e52878c6b6befbd9c6f"}
Apr 16 20:17:32.318851 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.318685 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw"
Apr 16 20:17:32.340584 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.340531 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw" podStartSLOduration=0.900553046 podStartE2EDuration="4.340516764s" podCreationTimestamp="2026-04-16 20:17:28 +0000 UTC" firstStartedPulling="2026-04-16 20:17:28.690806106 +0000 UTC m=+333.130451047" lastFinishedPulling="2026-04-16 20:17:32.130769806 +0000 UTC m=+336.570414765" observedRunningTime="2026-04-16 20:17:32.338631964 +0000 UTC m=+336.778276925" watchObservedRunningTime="2026-04-16 20:17:32.340516764 +0000 UTC m=+336.780161729"
Apr 16 20:17:32.677819 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.677786 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-zm6qf"]
Apr 16 20:17:32.681027 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.681010 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-zm6qf"
Apr 16 20:17:32.684689 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.684669 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 16 20:17:32.684794 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.684693 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-qngt5\""
Apr 16 20:17:32.684967 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.684949 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 20:17:32.697256 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.697236 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-zm6qf"]
Apr 16 20:17:32.734350 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.734328 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf"
Apr 16 20:17:32.734460 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.734374 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/5ec87aed-b64a-4af6-b7bb-398aea943bb7-cabundle0\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf"
Apr 16 20:17:32.734460 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.734408 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftc4s\" (UniqueName: \"kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-kube-api-access-ftc4s\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf"
Apr 16 20:17:32.835505 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.835478 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf"
Apr 16 20:17:32.835678 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.835534 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/5ec87aed-b64a-4af6-b7bb-398aea943bb7-cabundle0\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf"
Apr 16 20:17:32.835678 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.835564 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftc4s\" (UniqueName: \"kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-kube-api-access-ftc4s\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf"
Apr 16 20:17:32.835678 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:32.835647 2566 secret.go:281] references non-existent secret key: ca.crt
Apr 16 20:17:32.835678 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:32.835665 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 20:17:32.835678 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:32.835677 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-zm6qf: references non-existent secret key: ca.crt
Apr 16 20:17:32.835911 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:32.835742 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates podName:5ec87aed-b64a-4af6-b7bb-398aea943bb7 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:33.335722037 +0000 UTC m=+337.775366985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates") pod "keda-operator-ffbb595cb-zm6qf" (UID: "5ec87aed-b64a-4af6-b7bb-398aea943bb7") : references non-existent secret key: ca.crt
Apr 16 20:17:32.836293 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.836263 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/5ec87aed-b64a-4af6-b7bb-398aea943bb7-cabundle0\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf"
Apr 16 20:17:32.848019 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:32.847991 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftc4s\" (UniqueName: \"kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-kube-api-access-ftc4s\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf"
Apr 16 20:17:33.083396 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.083362 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"]
Apr 16 20:17:33.086817 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.086800 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"
Apr 16 20:17:33.089838 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.089812 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 16 20:17:33.097773 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.097752 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"]
Apr 16 20:17:33.137081 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.137030 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj6vm\" (UniqueName: \"kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-kube-api-access-zj6vm\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"
Apr 16 20:17:33.137239 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.137101 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/90b6983b-05a7-44cd-b838-219c8cea7ed1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"
Apr 16 20:17:33.137239 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.137160 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"
Apr 16 20:17:33.237711 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.237685 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj6vm\" (UniqueName: \"kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-kube-api-access-zj6vm\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"
Apr 16 20:17:33.237956 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.237925 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/90b6983b-05a7-44cd-b838-219c8cea7ed1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"
Apr 16 20:17:33.238098 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.238084 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"
Apr 16 20:17:33.238415 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.238393 2566 secret.go:281] references non-existent secret key: tls.crt
Apr 16 20:17:33.238510 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.238420 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 20:17:33.238510 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.238444 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld: references non-existent secret key: tls.crt
Apr 16 20:17:33.238510 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.238502 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates podName:90b6983b-05a7-44cd-b838-219c8cea7ed1 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:33.738481995 +0000 UTC m=+338.178126954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates") pod "keda-metrics-apiserver-7c9f485588-7wmld" (UID: "90b6983b-05a7-44cd-b838-219c8cea7ed1") : references non-existent secret key: tls.crt
Apr 16 20:17:33.238708 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.238680 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/90b6983b-05a7-44cd-b838-219c8cea7ed1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"
Apr 16 20:17:33.265203 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.265182 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj6vm\" (UniqueName: \"kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-kube-api-access-zj6vm\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"
Apr 16 20:17:33.338649 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.338592 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf"
Apr 16 20:17:33.339080 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.338739 2566 secret.go:281] references non-existent secret key: ca.crt
Apr 16 20:17:33.339080 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.338753 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 20:17:33.339080 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.338762 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-zm6qf: references non-existent secret key: ca.crt
Apr 16 20:17:33.339080 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.338812 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates podName:5ec87aed-b64a-4af6-b7bb-398aea943bb7 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:34.338798393 +0000 UTC m=+338.778443335 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates") pod "keda-operator-ffbb595cb-zm6qf" (UID: "5ec87aed-b64a-4af6-b7bb-398aea943bb7") : references non-existent secret key: ca.crt
Apr 16 20:17:33.364337 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.364306 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-6467f"]
Apr 16 20:17:33.367926 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.367905 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6467f"
Apr 16 20:17:33.370292 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.370274 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 16 20:17:33.380103 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.380079 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6467f"]
Apr 16 20:17:33.439142 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.439117 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57wg8\" (UniqueName: \"kubernetes.io/projected/2d43ebb1-0d99-4a08-887d-5a436e38ae63-kube-api-access-57wg8\") pod \"keda-admission-cf49989db-6467f\" (UID: \"2d43ebb1-0d99-4a08-887d-5a436e38ae63\") " pod="openshift-keda/keda-admission-cf49989db-6467f"
Apr 16 20:17:33.439320 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.439282 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2d43ebb1-0d99-4a08-887d-5a436e38ae63-certificates\") pod \"keda-admission-cf49989db-6467f\" (UID: \"2d43ebb1-0d99-4a08-887d-5a436e38ae63\") " pod="openshift-keda/keda-admission-cf49989db-6467f"
Apr 16 20:17:33.539787 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.539700 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2d43ebb1-0d99-4a08-887d-5a436e38ae63-certificates\") pod \"keda-admission-cf49989db-6467f\" (UID: \"2d43ebb1-0d99-4a08-887d-5a436e38ae63\") " pod="openshift-keda/keda-admission-cf49989db-6467f"
Apr 16 20:17:33.539787 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.539758 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57wg8\" (UniqueName: \"kubernetes.io/projected/2d43ebb1-0d99-4a08-887d-5a436e38ae63-kube-api-access-57wg8\") pod \"keda-admission-cf49989db-6467f\" (UID: \"2d43ebb1-0d99-4a08-887d-5a436e38ae63\") " pod="openshift-keda/keda-admission-cf49989db-6467f"
Apr 16 20:17:33.539996 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.539858 2566 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 16 20:17:33.539996 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.539888 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-6467f: secret "keda-admission-webhooks-certs" not found
Apr 16 20:17:33.539996 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.539956 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d43ebb1-0d99-4a08-887d-5a436e38ae63-certificates podName:2d43ebb1-0d99-4a08-887d-5a436e38ae63 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:34.039934533 +0000 UTC m=+338.479579497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2d43ebb1-0d99-4a08-887d-5a436e38ae63-certificates") pod "keda-admission-cf49989db-6467f" (UID: "2d43ebb1-0d99-4a08-887d-5a436e38ae63") : secret "keda-admission-webhooks-certs" not found
Apr 16 20:17:33.551253 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.551215 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57wg8\" (UniqueName: \"kubernetes.io/projected/2d43ebb1-0d99-4a08-887d-5a436e38ae63-kube-api-access-57wg8\") pod \"keda-admission-cf49989db-6467f\" (UID: \"2d43ebb1-0d99-4a08-887d-5a436e38ae63\") " pod="openshift-keda/keda-admission-cf49989db-6467f"
Apr 16 20:17:33.741546 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:33.740894 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"
Apr 16 20:17:33.741546 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.741085 2566 secret.go:281] references non-existent secret key: tls.crt
Apr 16 20:17:33.741546 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.741102 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 20:17:33.741546 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.741125 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld: references non-existent secret key: tls.crt
Apr 16 20:17:33.741546 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:33.741182 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates
podName:90b6983b-05a7-44cd-b838-219c8cea7ed1 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:34.741164469 +0000 UTC m=+339.180809416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates") pod "keda-metrics-apiserver-7c9f485588-7wmld" (UID: "90b6983b-05a7-44cd-b838-219c8cea7ed1") : references non-existent secret key: tls.crt Apr 16 20:17:34.043259 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:34.043226 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2d43ebb1-0d99-4a08-887d-5a436e38ae63-certificates\") pod \"keda-admission-cf49989db-6467f\" (UID: \"2d43ebb1-0d99-4a08-887d-5a436e38ae63\") " pod="openshift-keda/keda-admission-cf49989db-6467f" Apr 16 20:17:34.045513 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:34.045493 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2d43ebb1-0d99-4a08-887d-5a436e38ae63-certificates\") pod \"keda-admission-cf49989db-6467f\" (UID: \"2d43ebb1-0d99-4a08-887d-5a436e38ae63\") " pod="openshift-keda/keda-admission-cf49989db-6467f" Apr 16 20:17:34.279005 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:34.278970 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6467f" Apr 16 20:17:34.345977 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:34.345896 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf" Apr 16 20:17:34.346451 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:34.346428 2566 secret.go:281] references non-existent secret key: ca.crt Apr 16 20:17:34.346572 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:34.346455 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 20:17:34.346572 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:34.346468 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-zm6qf: references non-existent secret key: ca.crt Apr 16 20:17:34.346572 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:34.346522 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates podName:5ec87aed-b64a-4af6-b7bb-398aea943bb7 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:36.346503548 +0000 UTC m=+340.786148491 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates") pod "keda-operator-ffbb595cb-zm6qf" (UID: "5ec87aed-b64a-4af6-b7bb-398aea943bb7") : references non-existent secret key: ca.crt Apr 16 20:17:34.419992 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:34.419961 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6467f"] Apr 16 20:17:34.422344 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:17:34.422315 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d43ebb1_0d99_4a08_887d_5a436e38ae63.slice/crio-3355ec33725e646bf5235c7de510d66d2728e9b4401d1fc0c5a76225c1b3484d WatchSource:0}: Error finding container 3355ec33725e646bf5235c7de510d66d2728e9b4401d1fc0c5a76225c1b3484d: Status 404 returned error can't find the container with id 3355ec33725e646bf5235c7de510d66d2728e9b4401d1fc0c5a76225c1b3484d Apr 16 20:17:34.750151 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:34.750112 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld" Apr 16 20:17:34.750323 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:34.750301 2566 secret.go:281] references non-existent secret key: tls.crt Apr 16 20:17:34.750404 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:34.750329 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 20:17:34.750404 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:34.750354 2566 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld: references non-existent secret key: tls.crt Apr 16 20:17:34.750492 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:34.750426 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates podName:90b6983b-05a7-44cd-b838-219c8cea7ed1 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:36.750404095 +0000 UTC m=+341.190049052 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates") pod "keda-metrics-apiserver-7c9f485588-7wmld" (UID: "90b6983b-05a7-44cd-b838-219c8cea7ed1") : references non-existent secret key: tls.crt Apr 16 20:17:35.328827 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:35.328790 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6467f" event={"ID":"2d43ebb1-0d99-4a08-887d-5a436e38ae63","Type":"ContainerStarted","Data":"3355ec33725e646bf5235c7de510d66d2728e9b4401d1fc0c5a76225c1b3484d"} Apr 16 20:17:36.332718 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:36.332685 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6467f" event={"ID":"2d43ebb1-0d99-4a08-887d-5a436e38ae63","Type":"ContainerStarted","Data":"679b8ac08bbdcdc88de8f7c8445f52cc6d6d86a153f89c5c03c0fce753aaf3d7"} Apr 16 20:17:36.333031 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:36.332810 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-6467f" Apr 16 20:17:36.351292 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:36.351243 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-6467f" podStartSLOduration=1.531098039 podStartE2EDuration="3.351227308s" podCreationTimestamp="2026-04-16 20:17:33 +0000 UTC" 
firstStartedPulling="2026-04-16 20:17:34.423633901 +0000 UTC m=+338.863278843" lastFinishedPulling="2026-04-16 20:17:36.243763158 +0000 UTC m=+340.683408112" observedRunningTime="2026-04-16 20:17:36.349407166 +0000 UTC m=+340.789052140" watchObservedRunningTime="2026-04-16 20:17:36.351227308 +0000 UTC m=+340.790872274" Apr 16 20:17:36.364854 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:36.364830 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf" Apr 16 20:17:36.364973 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:36.364957 2566 secret.go:281] references non-existent secret key: ca.crt Apr 16 20:17:36.365008 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:36.364976 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 20:17:36.365008 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:36.364988 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-zm6qf: references non-existent secret key: ca.crt Apr 16 20:17:36.365078 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:36.365031 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates podName:5ec87aed-b64a-4af6-b7bb-398aea943bb7 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:40.36501801 +0000 UTC m=+344.804662957 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates") pod "keda-operator-ffbb595cb-zm6qf" (UID: "5ec87aed-b64a-4af6-b7bb-398aea943bb7") : references non-existent secret key: ca.crt Apr 16 20:17:36.768670 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:36.768641 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld" Apr 16 20:17:36.768818 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:36.768752 2566 secret.go:281] references non-existent secret key: tls.crt Apr 16 20:17:36.768818 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:36.768764 2566 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 20:17:36.768818 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:36.768780 2566 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld: references non-existent secret key: tls.crt Apr 16 20:17:36.768912 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:17:36.768828 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates podName:90b6983b-05a7-44cd-b838-219c8cea7ed1 nodeName:}" failed. No retries permitted until 2026-04-16 20:17:40.768814866 +0000 UTC m=+345.208459822 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates") pod "keda-metrics-apiserver-7c9f485588-7wmld" (UID: "90b6983b-05a7-44cd-b838-219c8cea7ed1") : references non-existent secret key: tls.crt Apr 16 20:17:40.400027 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:40.399997 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf" Apr 16 20:17:40.402307 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:40.402279 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5ec87aed-b64a-4af6-b7bb-398aea943bb7-certificates\") pod \"keda-operator-ffbb595cb-zm6qf\" (UID: \"5ec87aed-b64a-4af6-b7bb-398aea943bb7\") " pod="openshift-keda/keda-operator-ffbb595cb-zm6qf" Apr 16 20:17:40.490715 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:40.490691 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-zm6qf" Apr 16 20:17:40.606684 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:40.606653 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-zm6qf"] Apr 16 20:17:40.609371 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:17:40.609345 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ec87aed_b64a_4af6_b7bb_398aea943bb7.slice/crio-80a3e734f540794880835492dbfe5252770c06a2c2c81a032c6ccb265d1eafe1 WatchSource:0}: Error finding container 80a3e734f540794880835492dbfe5252770c06a2c2c81a032c6ccb265d1eafe1: Status 404 returned error can't find the container with id 80a3e734f540794880835492dbfe5252770c06a2c2c81a032c6ccb265d1eafe1 Apr 16 20:17:40.803056 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:40.803028 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld" Apr 16 20:17:40.805489 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:40.805466 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/90b6983b-05a7-44cd-b838-219c8cea7ed1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7wmld\" (UID: \"90b6983b-05a7-44cd-b838-219c8cea7ed1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld" Apr 16 20:17:40.897468 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:40.897434 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld" Apr 16 20:17:41.015403 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:41.015372 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld"] Apr 16 20:17:41.018228 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:17:41.018188 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b6983b_05a7_44cd_b838_219c8cea7ed1.slice/crio-578a862a91e055d2447d9e3e7c30ec45d1002b3d938550948b82790a917b7d74 WatchSource:0}: Error finding container 578a862a91e055d2447d9e3e7c30ec45d1002b3d938550948b82790a917b7d74: Status 404 returned error can't find the container with id 578a862a91e055d2447d9e3e7c30ec45d1002b3d938550948b82790a917b7d74 Apr 16 20:17:41.346767 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:41.346726 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld" event={"ID":"90b6983b-05a7-44cd-b838-219c8cea7ed1","Type":"ContainerStarted","Data":"578a862a91e055d2447d9e3e7c30ec45d1002b3d938550948b82790a917b7d74"} Apr 16 20:17:41.347730 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:41.347705 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-zm6qf" event={"ID":"5ec87aed-b64a-4af6-b7bb-398aea943bb7","Type":"ContainerStarted","Data":"80a3e734f540794880835492dbfe5252770c06a2c2c81a032c6ccb265d1eafe1"} Apr 16 20:17:45.361306 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:45.361272 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-zm6qf" event={"ID":"5ec87aed-b64a-4af6-b7bb-398aea943bb7","Type":"ContainerStarted","Data":"2738a67b7f7fea34e4d8c550bb0785ba1ed986eb259875bd7e7807be73616882"} Apr 16 20:17:45.361772 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:45.361336 2566 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-zm6qf" Apr 16 20:17:45.362583 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:45.362552 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld" event={"ID":"90b6983b-05a7-44cd-b838-219c8cea7ed1","Type":"ContainerStarted","Data":"dd9fc41696fe9a721c17669897180b1856070f9a9d4fc504bf5a96b0cdfc8449"} Apr 16 20:17:45.362725 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:45.362642 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld" Apr 16 20:17:45.379196 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:45.379154 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-zm6qf" podStartSLOduration=9.080208877 podStartE2EDuration="13.379140884s" podCreationTimestamp="2026-04-16 20:17:32 +0000 UTC" firstStartedPulling="2026-04-16 20:17:40.61068233 +0000 UTC m=+345.050327275" lastFinishedPulling="2026-04-16 20:17:44.909614328 +0000 UTC m=+349.349259282" observedRunningTime="2026-04-16 20:17:45.377585807 +0000 UTC m=+349.817230771" watchObservedRunningTime="2026-04-16 20:17:45.379140884 +0000 UTC m=+349.818785848" Apr 16 20:17:45.393189 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:45.393139 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld" podStartSLOduration=8.502795256 podStartE2EDuration="12.393127043s" podCreationTimestamp="2026-04-16 20:17:33 +0000 UTC" firstStartedPulling="2026-04-16 20:17:41.019587561 +0000 UTC m=+345.459232503" lastFinishedPulling="2026-04-16 20:17:44.909919339 +0000 UTC m=+349.349564290" observedRunningTime="2026-04-16 20:17:45.392433163 +0000 UTC m=+349.832078126" watchObservedRunningTime="2026-04-16 20:17:45.393127043 +0000 UTC m=+349.832772082" Apr 16 20:17:53.323221 ip-10-0-142-60 
kubenswrapper[2566]: I0416 20:17:53.323191 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6j5rw" Apr 16 20:17:56.370119 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:56.370086 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7wmld" Apr 16 20:17:57.337843 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:17:57.337816 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-6467f" Apr 16 20:18:06.368074 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:06.368040 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-zm6qf" Apr 16 20:18:41.200350 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.200318 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-nwhfk"] Apr 16 20:18:41.203470 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.203453 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" Apr 16 20:18:41.210550 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.210530 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 20:18:41.210550 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.210542 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 20:18:41.210734 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.210533 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 20:18:41.210734 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.210645 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-5s9t7\"" Apr 16 20:18:41.214717 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.212724 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf"] Apr 16 20:18:41.216828 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.216802 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" Apr 16 20:18:41.219491 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.219472 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-tp9hl\"" Apr 16 20:18:41.219986 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.219856 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 20:18:41.221577 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.221553 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-nwhfk"] Apr 16 20:18:41.227365 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.227346 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf"] Apr 16 20:18:41.244159 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.244136 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-z5btx"] Apr 16 20:18:41.247218 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.247204 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-z5btx" Apr 16 20:18:41.249291 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.249271 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 20:18:41.249369 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.249271 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-wl6nb\"" Apr 16 20:18:41.262250 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.262230 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kkh7\" (UniqueName: \"kubernetes.io/projected/83c8e6ff-26c2-4c21-9bb1-dbbae233db87-kube-api-access-4kkh7\") pod \"kserve-controller-manager-66cf78b85b-nwhfk\" (UID: \"83c8e6ff-26c2-4c21-9bb1-dbbae233db87\") " pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" Apr 16 20:18:41.262360 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.262267 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a9614e8a-875a-4cb7-a654-7b6658d8f950-data\") pod \"seaweedfs-86cc847c5c-z5btx\" (UID: \"a9614e8a-875a-4cb7-a654-7b6658d8f950\") " pod="kserve/seaweedfs-86cc847c5c-z5btx" Apr 16 20:18:41.262360 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.262287 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a3a4266-af45-4025-b06d-d54259c77a73-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-q2xlf\" (UID: \"8a3a4266-af45-4025-b06d-d54259c77a73\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" Apr 16 20:18:41.262591 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.262380 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/83c8e6ff-26c2-4c21-9bb1-dbbae233db87-cert\") pod \"kserve-controller-manager-66cf78b85b-nwhfk\" (UID: \"83c8e6ff-26c2-4c21-9bb1-dbbae233db87\") " pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" Apr 16 20:18:41.262591 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.262417 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7b9k\" (UniqueName: \"kubernetes.io/projected/8a3a4266-af45-4025-b06d-d54259c77a73-kube-api-access-r7b9k\") pod \"llmisvc-controller-manager-68cc5db7c4-q2xlf\" (UID: \"8a3a4266-af45-4025-b06d-d54259c77a73\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" Apr 16 20:18:41.262591 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.262426 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-z5btx"] Apr 16 20:18:41.262591 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.262447 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-948rx\" (UniqueName: \"kubernetes.io/projected/a9614e8a-875a-4cb7-a654-7b6658d8f950-kube-api-access-948rx\") pod \"seaweedfs-86cc847c5c-z5btx\" (UID: \"a9614e8a-875a-4cb7-a654-7b6658d8f950\") " pod="kserve/seaweedfs-86cc847c5c-z5btx" Apr 16 20:18:41.363337 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.363308 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83c8e6ff-26c2-4c21-9bb1-dbbae233db87-cert\") pod \"kserve-controller-manager-66cf78b85b-nwhfk\" (UID: \"83c8e6ff-26c2-4c21-9bb1-dbbae233db87\") " pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" Apr 16 20:18:41.363468 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.363350 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7b9k\" (UniqueName: \"kubernetes.io/projected/8a3a4266-af45-4025-b06d-d54259c77a73-kube-api-access-r7b9k\") 
pod \"llmisvc-controller-manager-68cc5db7c4-q2xlf\" (UID: \"8a3a4266-af45-4025-b06d-d54259c77a73\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" Apr 16 20:18:41.363468 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.363376 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-948rx\" (UniqueName: \"kubernetes.io/projected/a9614e8a-875a-4cb7-a654-7b6658d8f950-kube-api-access-948rx\") pod \"seaweedfs-86cc847c5c-z5btx\" (UID: \"a9614e8a-875a-4cb7-a654-7b6658d8f950\") " pod="kserve/seaweedfs-86cc847c5c-z5btx" Apr 16 20:18:41.363546 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.363485 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kkh7\" (UniqueName: \"kubernetes.io/projected/83c8e6ff-26c2-4c21-9bb1-dbbae233db87-kube-api-access-4kkh7\") pod \"kserve-controller-manager-66cf78b85b-nwhfk\" (UID: \"83c8e6ff-26c2-4c21-9bb1-dbbae233db87\") " pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" Apr 16 20:18:41.363595 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.363544 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a9614e8a-875a-4cb7-a654-7b6658d8f950-data\") pod \"seaweedfs-86cc847c5c-z5btx\" (UID: \"a9614e8a-875a-4cb7-a654-7b6658d8f950\") " pod="kserve/seaweedfs-86cc847c5c-z5btx" Apr 16 20:18:41.363595 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.363572 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a3a4266-af45-4025-b06d-d54259c77a73-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-q2xlf\" (UID: \"8a3a4266-af45-4025-b06d-d54259c77a73\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" Apr 16 20:18:41.363731 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:18:41.363712 2566 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret 
"llmisvc-webhook-server-cert" not found Apr 16 20:18:41.363803 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:18:41.363790 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a3a4266-af45-4025-b06d-d54259c77a73-cert podName:8a3a4266-af45-4025-b06d-d54259c77a73 nodeName:}" failed. No retries permitted until 2026-04-16 20:18:41.863768402 +0000 UTC m=+406.303413359 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a3a4266-af45-4025-b06d-d54259c77a73-cert") pod "llmisvc-controller-manager-68cc5db7c4-q2xlf" (UID: "8a3a4266-af45-4025-b06d-d54259c77a73") : secret "llmisvc-webhook-server-cert" not found Apr 16 20:18:41.363866 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.363847 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a9614e8a-875a-4cb7-a654-7b6658d8f950-data\") pod \"seaweedfs-86cc847c5c-z5btx\" (UID: \"a9614e8a-875a-4cb7-a654-7b6658d8f950\") " pod="kserve/seaweedfs-86cc847c5c-z5btx" Apr 16 20:18:41.365765 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.365745 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83c8e6ff-26c2-4c21-9bb1-dbbae233db87-cert\") pod \"kserve-controller-manager-66cf78b85b-nwhfk\" (UID: \"83c8e6ff-26c2-4c21-9bb1-dbbae233db87\") " pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" Apr 16 20:18:41.376659 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.376635 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-948rx\" (UniqueName: \"kubernetes.io/projected/a9614e8a-875a-4cb7-a654-7b6658d8f950-kube-api-access-948rx\") pod \"seaweedfs-86cc847c5c-z5btx\" (UID: \"a9614e8a-875a-4cb7-a654-7b6658d8f950\") " pod="kserve/seaweedfs-86cc847c5c-z5btx" Apr 16 20:18:41.376989 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.376968 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r7b9k\" (UniqueName: \"kubernetes.io/projected/8a3a4266-af45-4025-b06d-d54259c77a73-kube-api-access-r7b9k\") pod \"llmisvc-controller-manager-68cc5db7c4-q2xlf\" (UID: \"8a3a4266-af45-4025-b06d-d54259c77a73\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" Apr 16 20:18:41.379182 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.379157 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kkh7\" (UniqueName: \"kubernetes.io/projected/83c8e6ff-26c2-4c21-9bb1-dbbae233db87-kube-api-access-4kkh7\") pod \"kserve-controller-manager-66cf78b85b-nwhfk\" (UID: \"83c8e6ff-26c2-4c21-9bb1-dbbae233db87\") " pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" Apr 16 20:18:41.518772 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.518747 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" Apr 16 20:18:41.555882 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.555854 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-z5btx" Apr 16 20:18:41.643192 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.643160 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-nwhfk"] Apr 16 20:18:41.647169 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:18:41.647141 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83c8e6ff_26c2_4c21_9bb1_dbbae233db87.slice/crio-7abd4fb71c0fef4269e632f124f00cff10cc9d9ed20cca64cb5a2b96c9c5898a WatchSource:0}: Error finding container 7abd4fb71c0fef4269e632f124f00cff10cc9d9ed20cca64cb5a2b96c9c5898a: Status 404 returned error can't find the container with id 7abd4fb71c0fef4269e632f124f00cff10cc9d9ed20cca64cb5a2b96c9c5898a Apr 16 20:18:41.687085 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.686969 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-z5btx"] Apr 16 20:18:41.689047 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:18:41.689012 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9614e8a_875a_4cb7_a654_7b6658d8f950.slice/crio-b944a37a049cf139dbb4aeffddf4fc20f4c23938ffefda687076b1a2886e4c55 WatchSource:0}: Error finding container b944a37a049cf139dbb4aeffddf4fc20f4c23938ffefda687076b1a2886e4c55: Status 404 returned error can't find the container with id b944a37a049cf139dbb4aeffddf4fc20f4c23938ffefda687076b1a2886e4c55 Apr 16 20:18:41.868775 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:41.868699 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a3a4266-af45-4025-b06d-d54259c77a73-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-q2xlf\" (UID: \"8a3a4266-af45-4025-b06d-d54259c77a73\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" Apr 16 20:18:41.870967 ip-10-0-142-60 
kubenswrapper[2566]: I0416 20:18:41.870949 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a3a4266-af45-4025-b06d-d54259c77a73-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-q2xlf\" (UID: \"8a3a4266-af45-4025-b06d-d54259c77a73\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" Apr 16 20:18:42.127974 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:42.127896 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" Apr 16 20:18:42.298232 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:42.298201 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf"] Apr 16 20:18:42.415470 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:18:42.415393 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8a3a4266_af45_4025_b06d_d54259c77a73.slice/crio-9395b61d0d31e6797b195f8f8fb7a26dbe740065a460890469a0feed69ea13d5 WatchSource:0}: Error finding container 9395b61d0d31e6797b195f8f8fb7a26dbe740065a460890469a0feed69ea13d5: Status 404 returned error can't find the container with id 9395b61d0d31e6797b195f8f8fb7a26dbe740065a460890469a0feed69ea13d5 Apr 16 20:18:42.524712 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:42.524634 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" event={"ID":"8a3a4266-af45-4025-b06d-d54259c77a73","Type":"ContainerStarted","Data":"9395b61d0d31e6797b195f8f8fb7a26dbe740065a460890469a0feed69ea13d5"} Apr 16 20:18:42.526592 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:42.526551 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-z5btx" event={"ID":"a9614e8a-875a-4cb7-a654-7b6658d8f950","Type":"ContainerStarted","Data":"b944a37a049cf139dbb4aeffddf4fc20f4c23938ffefda687076b1a2886e4c55"} Apr 16 20:18:42.528088 
ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:42.528049 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" event={"ID":"83c8e6ff-26c2-4c21-9bb1-dbbae233db87","Type":"ContainerStarted","Data":"7abd4fb71c0fef4269e632f124f00cff10cc9d9ed20cca64cb5a2b96c9c5898a"} Apr 16 20:18:46.548035 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:46.547993 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-z5btx" event={"ID":"a9614e8a-875a-4cb7-a654-7b6658d8f950","Type":"ContainerStarted","Data":"43a477c6ab93fed627571509e4c6706e5739b075ad31df440d394fb0d3eb0f14"} Apr 16 20:18:46.548413 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:46.548209 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-z5btx" Apr 16 20:18:46.549487 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:46.549461 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" event={"ID":"83c8e6ff-26c2-4c21-9bb1-dbbae233db87","Type":"ContainerStarted","Data":"797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730"} Apr 16 20:18:46.549620 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:46.549590 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" Apr 16 20:18:46.568166 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:46.568015 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-z5btx" podStartSLOduration=0.795729922 podStartE2EDuration="5.567996657s" podCreationTimestamp="2026-04-16 20:18:41 +0000 UTC" firstStartedPulling="2026-04-16 20:18:41.690279971 +0000 UTC m=+406.129924913" lastFinishedPulling="2026-04-16 20:18:46.462546705 +0000 UTC m=+410.902191648" observedRunningTime="2026-04-16 20:18:46.565563688 +0000 UTC m=+411.005208663" watchObservedRunningTime="2026-04-16 
20:18:46.567996657 +0000 UTC m=+411.007641622" Apr 16 20:18:46.583818 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:46.583778 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" podStartSLOduration=1.9316159339999999 podStartE2EDuration="5.58376083s" podCreationTimestamp="2026-04-16 20:18:41 +0000 UTC" firstStartedPulling="2026-04-16 20:18:41.648923836 +0000 UTC m=+406.088568781" lastFinishedPulling="2026-04-16 20:18:45.301068723 +0000 UTC m=+409.740713677" observedRunningTime="2026-04-16 20:18:46.581900673 +0000 UTC m=+411.021545636" watchObservedRunningTime="2026-04-16 20:18:46.58376083 +0000 UTC m=+411.023405848" Apr 16 20:18:47.553369 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:47.553331 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" event={"ID":"8a3a4266-af45-4025-b06d-d54259c77a73","Type":"ContainerStarted","Data":"080210085f3fa1cae66955729b4e6cb096db484a1cdf044b8eee0485097e8f35"} Apr 16 20:18:47.570339 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:47.570300 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" podStartSLOduration=2.467869287 podStartE2EDuration="6.570287487s" podCreationTimestamp="2026-04-16 20:18:41 +0000 UTC" firstStartedPulling="2026-04-16 20:18:42.417822906 +0000 UTC m=+406.857467861" lastFinishedPulling="2026-04-16 20:18:46.520241102 +0000 UTC m=+410.959886061" observedRunningTime="2026-04-16 20:18:47.569223585 +0000 UTC m=+412.008868549" watchObservedRunningTime="2026-04-16 20:18:47.570287487 +0000 UTC m=+412.009932445" Apr 16 20:18:48.556544 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:48.556509 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" Apr 16 20:18:52.555947 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:18:52.555919 
2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-z5btx" Apr 16 20:19:17.559178 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:17.559088 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" Apr 16 20:19:19.561836 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:19.561797 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-q2xlf" Apr 16 20:19:20.864827 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:20.864792 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-nwhfk"] Apr 16 20:19:20.865252 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:20.865021 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" podUID="83c8e6ff-26c2-4c21-9bb1-dbbae233db87" containerName="manager" containerID="cri-o://797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730" gracePeriod=10 Apr 16 20:19:20.913313 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:20.913286 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-6rj6g"] Apr 16 20:19:20.915460 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:20.915445 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" Apr 16 20:19:20.918264 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:20.918239 2566 status_manager.go:895] "Failed to get status for pod" podUID="22cafa07-f4f8-4b2e-80c3-9a953a924de1" pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" err="pods \"kserve-controller-manager-66cf78b85b-6rj6g\" is forbidden: User \"system:node:ip-10-0-142-60.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kserve\": no relationship found between node 'ip-10-0-142-60.ec2.internal' and this object" Apr 16 20:19:20.939548 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:20.939529 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-6rj6g"] Apr 16 20:19:21.081056 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.081024 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22cafa07-f4f8-4b2e-80c3-9a953a924de1-cert\") pod \"kserve-controller-manager-66cf78b85b-6rj6g\" (UID: \"22cafa07-f4f8-4b2e-80c3-9a953a924de1\") " pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" Apr 16 20:19:21.081204 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.081071 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2j44\" (UniqueName: \"kubernetes.io/projected/22cafa07-f4f8-4b2e-80c3-9a953a924de1-kube-api-access-d2j44\") pod \"kserve-controller-manager-66cf78b85b-6rj6g\" (UID: \"22cafa07-f4f8-4b2e-80c3-9a953a924de1\") " pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" Apr 16 20:19:21.098828 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.098810 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" Apr 16 20:19:21.182361 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.182290 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83c8e6ff-26c2-4c21-9bb1-dbbae233db87-cert\") pod \"83c8e6ff-26c2-4c21-9bb1-dbbae233db87\" (UID: \"83c8e6ff-26c2-4c21-9bb1-dbbae233db87\") " Apr 16 20:19:21.182361 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.182338 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kkh7\" (UniqueName: \"kubernetes.io/projected/83c8e6ff-26c2-4c21-9bb1-dbbae233db87-kube-api-access-4kkh7\") pod \"83c8e6ff-26c2-4c21-9bb1-dbbae233db87\" (UID: \"83c8e6ff-26c2-4c21-9bb1-dbbae233db87\") " Apr 16 20:19:21.182513 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.182491 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22cafa07-f4f8-4b2e-80c3-9a953a924de1-cert\") pod \"kserve-controller-manager-66cf78b85b-6rj6g\" (UID: \"22cafa07-f4f8-4b2e-80c3-9a953a924de1\") " pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" Apr 16 20:19:21.182549 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.182524 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2j44\" (UniqueName: \"kubernetes.io/projected/22cafa07-f4f8-4b2e-80c3-9a953a924de1-kube-api-access-d2j44\") pod \"kserve-controller-manager-66cf78b85b-6rj6g\" (UID: \"22cafa07-f4f8-4b2e-80c3-9a953a924de1\") " pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" Apr 16 20:19:21.184399 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.184373 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83c8e6ff-26c2-4c21-9bb1-dbbae233db87-cert" (OuterVolumeSpecName: "cert") pod "83c8e6ff-26c2-4c21-9bb1-dbbae233db87" (UID: 
"83c8e6ff-26c2-4c21-9bb1-dbbae233db87"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:19:21.184399 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.184378 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83c8e6ff-26c2-4c21-9bb1-dbbae233db87-kube-api-access-4kkh7" (OuterVolumeSpecName: "kube-api-access-4kkh7") pod "83c8e6ff-26c2-4c21-9bb1-dbbae233db87" (UID: "83c8e6ff-26c2-4c21-9bb1-dbbae233db87"). InnerVolumeSpecName "kube-api-access-4kkh7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:19:21.184802 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.184787 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22cafa07-f4f8-4b2e-80c3-9a953a924de1-cert\") pod \"kserve-controller-manager-66cf78b85b-6rj6g\" (UID: \"22cafa07-f4f8-4b2e-80c3-9a953a924de1\") " pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" Apr 16 20:19:21.190720 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.190696 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2j44\" (UniqueName: \"kubernetes.io/projected/22cafa07-f4f8-4b2e-80c3-9a953a924de1-kube-api-access-d2j44\") pod \"kserve-controller-manager-66cf78b85b-6rj6g\" (UID: \"22cafa07-f4f8-4b2e-80c3-9a953a924de1\") " pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" Apr 16 20:19:21.240595 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.240556 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" Apr 16 20:19:21.283106 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.283075 2566 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83c8e6ff-26c2-4c21-9bb1-dbbae233db87-cert\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\"" Apr 16 20:19:21.283106 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.283105 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4kkh7\" (UniqueName: \"kubernetes.io/projected/83c8e6ff-26c2-4c21-9bb1-dbbae233db87-kube-api-access-4kkh7\") on node \"ip-10-0-142-60.ec2.internal\" DevicePath \"\"" Apr 16 20:19:21.359409 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.359365 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-6rj6g"] Apr 16 20:19:21.361911 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:19:21.361883 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22cafa07_f4f8_4b2e_80c3_9a953a924de1.slice/crio-118ce10679fee34ab9fdad09311778d9ea46fd34a4ca9e31275ffe29a47e0712 WatchSource:0}: Error finding container 118ce10679fee34ab9fdad09311778d9ea46fd34a4ca9e31275ffe29a47e0712: Status 404 returned error can't find the container with id 118ce10679fee34ab9fdad09311778d9ea46fd34a4ca9e31275ffe29a47e0712 Apr 16 20:19:21.658423 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.658392 2566 generic.go:358] "Generic (PLEG): container finished" podID="83c8e6ff-26c2-4c21-9bb1-dbbae233db87" containerID="797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730" exitCode=0 Apr 16 20:19:21.658586 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.658456 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" Apr 16 20:19:21.658586 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.658481 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" event={"ID":"83c8e6ff-26c2-4c21-9bb1-dbbae233db87","Type":"ContainerDied","Data":"797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730"} Apr 16 20:19:21.658586 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.658516 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-nwhfk" event={"ID":"83c8e6ff-26c2-4c21-9bb1-dbbae233db87","Type":"ContainerDied","Data":"7abd4fb71c0fef4269e632f124f00cff10cc9d9ed20cca64cb5a2b96c9c5898a"} Apr 16 20:19:21.658586 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.658533 2566 scope.go:117] "RemoveContainer" containerID="797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730" Apr 16 20:19:21.659547 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.659485 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" event={"ID":"22cafa07-f4f8-4b2e-80c3-9a953a924de1","Type":"ContainerStarted","Data":"118ce10679fee34ab9fdad09311778d9ea46fd34a4ca9e31275ffe29a47e0712"} Apr 16 20:19:21.666409 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.666394 2566 scope.go:117] "RemoveContainer" containerID="797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730" Apr 16 20:19:21.669017 ip-10-0-142-60 kubenswrapper[2566]: E0416 20:19:21.667118 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730\": container with ID starting with 797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730 not found: ID does not exist" containerID="797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730" Apr 16 
20:19:21.669017 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.667149 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730"} err="failed to get container status \"797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730\": rpc error: code = NotFound desc = could not find container \"797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730\": container with ID starting with 797f7c923f1d8c9e331a0c95c71106b0e75130664b48771d7b12c9c7da6a6730 not found: ID does not exist" Apr 16 20:19:21.679262 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.679238 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-nwhfk"] Apr 16 20:19:21.681877 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:21.681856 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-nwhfk"] Apr 16 20:19:22.248755 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:22.248724 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83c8e6ff-26c2-4c21-9bb1-dbbae233db87" path="/var/lib/kubelet/pods/83c8e6ff-26c2-4c21-9bb1-dbbae233db87/volumes" Apr 16 20:19:22.664481 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:22.664402 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" event={"ID":"22cafa07-f4f8-4b2e-80c3-9a953a924de1","Type":"ContainerStarted","Data":"9b61e07abc6b4577fda0df312e65e615ab6e82486dd31bae5c4818f4bf24d56f"} Apr 16 20:19:22.664664 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:22.664514 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" Apr 16 20:19:22.679365 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:22.679322 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" podStartSLOduration=2.262992375 podStartE2EDuration="2.679309519s" podCreationTimestamp="2026-04-16 20:19:20 +0000 UTC" firstStartedPulling="2026-04-16 20:19:21.363171524 +0000 UTC m=+445.802816465" lastFinishedPulling="2026-04-16 20:19:21.779488659 +0000 UTC m=+446.219133609" observedRunningTime="2026-04-16 20:19:22.678774345 +0000 UTC m=+447.118419309" watchObservedRunningTime="2026-04-16 20:19:22.679309519 +0000 UTC m=+447.118954482" Apr 16 20:19:53.672235 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:53.672205 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-6rj6g" Apr 16 20:19:54.657783 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.657752 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-2cmx6"] Apr 16 20:19:54.658054 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.658043 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83c8e6ff-26c2-4c21-9bb1-dbbae233db87" containerName="manager" Apr 16 20:19:54.658106 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.658056 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c8e6ff-26c2-4c21-9bb1-dbbae233db87" containerName="manager" Apr 16 20:19:54.658139 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.658119 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="83c8e6ff-26c2-4c21-9bb1-dbbae233db87" containerName="manager" Apr 16 20:19:54.659964 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.659948 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-2cmx6" Apr 16 20:19:54.662278 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.662257 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-dwwvl\"" Apr 16 20:19:54.662405 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.662295 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 20:19:54.669084 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.669065 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-2cmx6"] Apr 16 20:19:54.736410 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.736382 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-792sx\" (UniqueName: \"kubernetes.io/projected/af940349-91ed-4fcc-b6f9-84cd2c0a7967-kube-api-access-792sx\") pod \"model-serving-api-86f7b4b499-2cmx6\" (UID: \"af940349-91ed-4fcc-b6f9-84cd2c0a7967\") " pod="kserve/model-serving-api-86f7b4b499-2cmx6" Apr 16 20:19:54.736743 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.736430 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/af940349-91ed-4fcc-b6f9-84cd2c0a7967-tls-certs\") pod \"model-serving-api-86f7b4b499-2cmx6\" (UID: \"af940349-91ed-4fcc-b6f9-84cd2c0a7967\") " pod="kserve/model-serving-api-86f7b4b499-2cmx6" Apr 16 20:19:54.837662 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.837635 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-792sx\" (UniqueName: \"kubernetes.io/projected/af940349-91ed-4fcc-b6f9-84cd2c0a7967-kube-api-access-792sx\") pod \"model-serving-api-86f7b4b499-2cmx6\" (UID: \"af940349-91ed-4fcc-b6f9-84cd2c0a7967\") " pod="kserve/model-serving-api-86f7b4b499-2cmx6" Apr 16 20:19:54.837793 
ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.837679 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/af940349-91ed-4fcc-b6f9-84cd2c0a7967-tls-certs\") pod \"model-serving-api-86f7b4b499-2cmx6\" (UID: \"af940349-91ed-4fcc-b6f9-84cd2c0a7967\") " pod="kserve/model-serving-api-86f7b4b499-2cmx6" Apr 16 20:19:54.840072 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.840047 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/af940349-91ed-4fcc-b6f9-84cd2c0a7967-tls-certs\") pod \"model-serving-api-86f7b4b499-2cmx6\" (UID: \"af940349-91ed-4fcc-b6f9-84cd2c0a7967\") " pod="kserve/model-serving-api-86f7b4b499-2cmx6" Apr 16 20:19:54.846247 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.846226 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-792sx\" (UniqueName: \"kubernetes.io/projected/af940349-91ed-4fcc-b6f9-84cd2c0a7967-kube-api-access-792sx\") pod \"model-serving-api-86f7b4b499-2cmx6\" (UID: \"af940349-91ed-4fcc-b6f9-84cd2c0a7967\") " pod="kserve/model-serving-api-86f7b4b499-2cmx6" Apr 16 20:19:54.972752 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:54.972669 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-2cmx6" Apr 16 20:19:55.091477 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:55.091453 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-2cmx6"] Apr 16 20:19:55.094215 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:19:55.094186 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf940349_91ed_4fcc_b6f9_84cd2c0a7967.slice/crio-23452e3c1d449902631dddbeed6ca35092eb13b8382ad42497e339914c80d5bc WatchSource:0}: Error finding container 23452e3c1d449902631dddbeed6ca35092eb13b8382ad42497e339914c80d5bc: Status 404 returned error can't find the container with id 23452e3c1d449902631dddbeed6ca35092eb13b8382ad42497e339914c80d5bc Apr 16 20:19:55.775847 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:55.775811 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-2cmx6" event={"ID":"af940349-91ed-4fcc-b6f9-84cd2c0a7967","Type":"ContainerStarted","Data":"23452e3c1d449902631dddbeed6ca35092eb13b8382ad42497e339914c80d5bc"} Apr 16 20:19:56.780024 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:56.779984 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-2cmx6" event={"ID":"af940349-91ed-4fcc-b6f9-84cd2c0a7967","Type":"ContainerStarted","Data":"258e0b7376582616567f9ea40b0bf3c8c1962e379d993441cf856c21801d8ae1"} Apr 16 20:19:56.780394 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:56.780117 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-2cmx6" Apr 16 20:19:56.795206 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:19:56.795165 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-2cmx6" podStartSLOduration=1.6431132210000001 podStartE2EDuration="2.79515173s" 
podCreationTimestamp="2026-04-16 20:19:54 +0000 UTC" firstStartedPulling="2026-04-16 20:19:55.096304374 +0000 UTC m=+479.535949329" lastFinishedPulling="2026-04-16 20:19:56.248342896 +0000 UTC m=+480.687987838" observedRunningTime="2026-04-16 20:19:56.79486048 +0000 UTC m=+481.234505464" watchObservedRunningTime="2026-04-16 20:19:56.79515173 +0000 UTC m=+481.234796693" Apr 16 20:20:07.787698 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:07.787652 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-2cmx6" Apr 16 20:20:40.970110 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:40.970078 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9"] Apr 16 20:20:40.972409 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:40.972394 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9" Apr 16 20:20:40.974638 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:40.974619 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 16 20:20:40.974749 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:40.974663 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 20:20:40.980436 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:40.980415 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9"] Apr 16 20:20:41.008195 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.008171 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/342f9628-2fd1-4b9e-8f66-b308c08c4fd8-data\") pod \"seaweedfs-tls-serving-7fd5766db9-7rwd9\" (UID: \"342f9628-2fd1-4b9e-8f66-b308c08c4fd8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9" Apr 16 
20:20:41.008313 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.008200 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/342f9628-2fd1-4b9e-8f66-b308c08c4fd8-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-7rwd9\" (UID: \"342f9628-2fd1-4b9e-8f66-b308c08c4fd8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9"
Apr 16 20:20:41.008354 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.008319 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxs84\" (UniqueName: \"kubernetes.io/projected/342f9628-2fd1-4b9e-8f66-b308c08c4fd8-kube-api-access-qxs84\") pod \"seaweedfs-tls-serving-7fd5766db9-7rwd9\" (UID: \"342f9628-2fd1-4b9e-8f66-b308c08c4fd8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9"
Apr 16 20:20:41.109199 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.109173 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxs84\" (UniqueName: \"kubernetes.io/projected/342f9628-2fd1-4b9e-8f66-b308c08c4fd8-kube-api-access-qxs84\") pod \"seaweedfs-tls-serving-7fd5766db9-7rwd9\" (UID: \"342f9628-2fd1-4b9e-8f66-b308c08c4fd8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9"
Apr 16 20:20:41.109322 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.109221 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/342f9628-2fd1-4b9e-8f66-b308c08c4fd8-data\") pod \"seaweedfs-tls-serving-7fd5766db9-7rwd9\" (UID: \"342f9628-2fd1-4b9e-8f66-b308c08c4fd8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9"
Apr 16 20:20:41.109361 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.109341 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/342f9628-2fd1-4b9e-8f66-b308c08c4fd8-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-7rwd9\" (UID: \"342f9628-2fd1-4b9e-8f66-b308c08c4fd8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9"
Apr 16 20:20:41.109542 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.109526 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/342f9628-2fd1-4b9e-8f66-b308c08c4fd8-data\") pod \"seaweedfs-tls-serving-7fd5766db9-7rwd9\" (UID: \"342f9628-2fd1-4b9e-8f66-b308c08c4fd8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9"
Apr 16 20:20:41.111584 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.111560 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/342f9628-2fd1-4b9e-8f66-b308c08c4fd8-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-7rwd9\" (UID: \"342f9628-2fd1-4b9e-8f66-b308c08c4fd8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9"
Apr 16 20:20:41.124808 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.124785 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxs84\" (UniqueName: \"kubernetes.io/projected/342f9628-2fd1-4b9e-8f66-b308c08c4fd8-kube-api-access-qxs84\") pod \"seaweedfs-tls-serving-7fd5766db9-7rwd9\" (UID: \"342f9628-2fd1-4b9e-8f66-b308c08c4fd8\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9"
Apr 16 20:20:41.282206 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.282179 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9"
Apr 16 20:20:41.396207 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.396176 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9"]
Apr 16 20:20:41.400122 ip-10-0-142-60 kubenswrapper[2566]: W0416 20:20:41.400098 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod342f9628_2fd1_4b9e_8f66_b308c08c4fd8.slice/crio-4870b198f2bff098f00cf56283f16e21ea335547e68abd50f0b184d0da73fa51 WatchSource:0}: Error finding container 4870b198f2bff098f00cf56283f16e21ea335547e68abd50f0b184d0da73fa51: Status 404 returned error can't find the container with id 4870b198f2bff098f00cf56283f16e21ea335547e68abd50f0b184d0da73fa51
Apr 16 20:20:41.923650 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.923546 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9" event={"ID":"342f9628-2fd1-4b9e-8f66-b308c08c4fd8","Type":"ContainerStarted","Data":"dc9ab4b20f6cbc6ae319d6bd0cd7b641b22f38430cb02436b92ac02eaeaa2977"}
Apr 16 20:20:41.923650 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.923585 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9" event={"ID":"342f9628-2fd1-4b9e-8f66-b308c08c4fd8","Type":"ContainerStarted","Data":"4870b198f2bff098f00cf56283f16e21ea335547e68abd50f0b184d0da73fa51"}
Apr 16 20:20:41.940406 ip-10-0-142-60 kubenswrapper[2566]: I0416 20:20:41.940361 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-7rwd9" podStartSLOduration=1.673159655 podStartE2EDuration="1.94034774s" podCreationTimestamp="2026-04-16 20:20:40 +0000 UTC" firstStartedPulling="2026-04-16 20:20:41.401257694 +0000 UTC m=+525.840902636" lastFinishedPulling="2026-04-16 20:20:41.668445776 +0000 UTC m=+526.108090721" observedRunningTime="2026-04-16 20:20:41.937942263 +0000 UTC m=+526.377587226" watchObservedRunningTime="2026-04-16 20:20:41.94034774 +0000 UTC m=+526.379992703"
Apr 16 21:12:03.154484 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:03.154455 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-kpb78_8a902087-f546-42a1-b9a5-96dab151ae99/global-pull-secret-syncer/0.log"
Apr 16 21:12:03.206120 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:03.206093 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-kzrvj_a90e7ceb-6160-4490-81d3-0bf334a5861e/konnectivity-agent/0.log"
Apr 16 21:12:03.370084 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:03.370052 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-60.ec2.internal_4950d98dd76e149e921ac7e36cab051b/haproxy/0.log"
Apr 16 21:12:06.764458 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:06.764426 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-lcndm_a5125243-6f0d-4b3b-a7dc-3c481a10fdcb/cluster-monitoring-operator/0.log"
Apr 16 21:12:07.081230 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:07.081158 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hlml2_d0f052db-d4c8-42a7-8862-1360fad89eb4/node-exporter/0.log"
Apr 16 21:12:07.101691 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:07.101665 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hlml2_d0f052db-d4c8-42a7-8862-1360fad89eb4/kube-rbac-proxy/0.log"
Apr 16 21:12:07.124965 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:07.124945 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hlml2_d0f052db-d4c8-42a7-8862-1360fad89eb4/init-textfile/0.log"
Apr 16 21:12:07.407521 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:07.407447 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-s7xs5_a07d554b-3b76-42a4-90f0-795d65c8a58a/prometheus-operator/0.log"
Apr 16 21:12:07.425177 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:07.425155 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-s7xs5_a07d554b-3b76-42a4-90f0-795d65c8a58a/kube-rbac-proxy/0.log"
Apr 16 21:12:07.488364 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:07.488323 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5fb6b49696-77fxj_8045ea37-6b10-43e2-90c7-50ee6a8ba3f3/telemeter-client/0.log"
Apr 16 21:12:07.511754 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:07.511730 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5fb6b49696-77fxj_8045ea37-6b10-43e2-90c7-50ee6a8ba3f3/reload/0.log"
Apr 16 21:12:07.535877 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:07.535858 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5fb6b49696-77fxj_8045ea37-6b10-43e2-90c7-50ee6a8ba3f3/kube-rbac-proxy/0.log"
Apr 16 21:12:09.653953 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:09.653920 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-xg7v7_2a8395f1-29fd-4e8d-aead-f6211089cc8d/download-server/0.log"
Apr 16 21:12:10.030397 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.030350 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-j6vbj_0a787dc5-7cdd-40df-bd39-3b2a23ba5aa7/volume-data-source-validator/0.log"
Apr 16 21:12:10.098224 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.098193 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"]
Apr 16 21:12:10.101575 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.101560 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.103868 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.103845 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g6jdf\"/\"kube-root-ca.crt\""
Apr 16 21:12:10.103868 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.103865 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g6jdf\"/\"openshift-service-ca.crt\""
Apr 16 21:12:10.104544 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.104529 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g6jdf\"/\"default-dockercfg-w7zrq\""
Apr 16 21:12:10.109416 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.109394 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"]
Apr 16 21:12:10.220780 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.220745 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-lib-modules\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.220974 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.220855 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djpbw\" (UniqueName: \"kubernetes.io/projected/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-kube-api-access-djpbw\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.220974 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.220908 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-sys\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.220974 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.220931 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-proc\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.220974 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.220956 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-podres\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.322416 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.322324 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djpbw\" (UniqueName: \"kubernetes.io/projected/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-kube-api-access-djpbw\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.322416 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.322375 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-sys\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.322416 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.322399 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-proc\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.322416 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.322424 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-podres\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.322768 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.322475 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-sys\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.322768 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.322477 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-lib-modules\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.322768 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.322542 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-proc\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.322768 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.322586 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-podres\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.322768 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.322664 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-lib-modules\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.329959 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.329934 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djpbw\" (UniqueName: \"kubernetes.io/projected/3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9-kube-api-access-djpbw\") pod \"perf-node-gather-daemonset-tth28\" (UID: \"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.411449 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.411421 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:10.532011 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.531981 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"]
Apr 16 21:12:10.537917 ip-10-0-142-60 kubenswrapper[2566]: W0416 21:12:10.537887 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3b67ed7d_1613_4d0d_8e25_8f7dd3ff92a9.slice/crio-8060ca899b21761c4db5f1f7df503fac67ca4d4bcb73e5eb41fb177a9fd8a42a WatchSource:0}: Error finding container 8060ca899b21761c4db5f1f7df503fac67ca4d4bcb73e5eb41fb177a9fd8a42a: Status 404 returned error can't find the container with id 8060ca899b21761c4db5f1f7df503fac67ca4d4bcb73e5eb41fb177a9fd8a42a
Apr 16 21:12:10.539359 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.539337 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 21:12:10.610744 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.610713 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28" event={"ID":"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9","Type":"ContainerStarted","Data":"8060ca899b21761c4db5f1f7df503fac67ca4d4bcb73e5eb41fb177a9fd8a42a"}
Apr 16 21:12:10.744533 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.744507 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-k5t5f_96e95540-055f-454d-b85c-31093fbd7bbf/dns/0.log"
Apr 16 21:12:10.768819 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.768791 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-k5t5f_96e95540-055f-454d-b85c-31093fbd7bbf/kube-rbac-proxy/0.log"
Apr 16 21:12:10.906649 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:10.906542 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q76dr_87948db0-f0f9-46ff-ad52-0b6cb7a17f42/dns-node-resolver/0.log"
Apr 16 21:12:11.395834 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:11.395800 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5wvp4_3fe6aa55-9c5e-4ed7-bd31-4790e51c271b/node-ca/0.log"
Apr 16 21:12:11.614764 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:11.614730 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28" event={"ID":"3b67ed7d-1613-4d0d-8e25-8f7dd3ff92a9","Type":"ContainerStarted","Data":"aadd37a3e7bf719c01e2a964f60b416b1b9afc02284ec1fbd504615d07d92d79"}
Apr 16 21:12:11.614949 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:11.614890 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:11.631458 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:11.631414 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28" podStartSLOduration=1.631400733 podStartE2EDuration="1.631400733s" podCreationTimestamp="2026-04-16 21:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:12:11.629928487 +0000 UTC m=+3616.069573461" watchObservedRunningTime="2026-04-16 21:12:11.631400733 +0000 UTC m=+3616.071045697"
Apr 16 21:12:12.180242 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:12.180208 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b64899dcd-25ctw_3d1461df-f676-49a3-a685-4bddd22c8287/router/0.log"
Apr 16 21:12:12.565654 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:12.565626 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-t7fts_42bbcb75-7cbe-482c-8c08-a9ceeb1c626d/serve-healthcheck-canary/0.log"
Apr 16 21:12:12.905076 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:12.904994 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-q5swj_387f5caa-46e7-4c7e-9eb3-9fececd0858d/insights-operator/0.log"
Apr 16 21:12:12.905218 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:12.905198 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-q5swj_387f5caa-46e7-4c7e-9eb3-9fececd0858d/insights-operator/1.log"
Apr 16 21:12:13.065808 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:13.065771 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x9xwk_025e9cf8-7d1b-4745-92f1-6017cc3167d6/kube-rbac-proxy/0.log"
Apr 16 21:12:13.086804 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:13.086772 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x9xwk_025e9cf8-7d1b-4745-92f1-6017cc3167d6/exporter/0.log"
Apr 16 21:12:13.110291 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:13.110256 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x9xwk_025e9cf8-7d1b-4745-92f1-6017cc3167d6/extractor/0.log"
Apr 16 21:12:15.061654 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:15.061623 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-66cf78b85b-6rj6g_22cafa07-f4f8-4b2e-80c3-9a953a924de1/manager/0.log"
Apr 16 21:12:15.081180 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:15.081159 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-q2xlf_8a3a4266-af45-4025-b06d-d54259c77a73/manager/0.log"
Apr 16 21:12:15.103962 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:15.103945 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-2cmx6_af940349-91ed-4fcc-b6f9-84cd2c0a7967/server/0.log"
Apr 16 21:12:15.403857 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:15.403783 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-z5btx_a9614e8a-875a-4cb7-a654-7b6658d8f950/seaweedfs/0.log"
Apr 16 21:12:15.453088 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:15.453060 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-7rwd9_342f9628-2fd1-4b9e-8f66-b308c08c4fd8/seaweedfs-tls-serving/0.log"
Apr 16 21:12:17.627766 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:17.627734 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-tth28"
Apr 16 21:12:20.682086 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:20.682062 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4llv8_6321d139-42ba-4ad4-96d3-6dafabbdc869/kube-multus-additional-cni-plugins/0.log"
Apr 16 21:12:20.747817 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:20.747798 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4llv8_6321d139-42ba-4ad4-96d3-6dafabbdc869/egress-router-binary-copy/0.log"
Apr 16 21:12:20.786632 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:20.786589 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4llv8_6321d139-42ba-4ad4-96d3-6dafabbdc869/cni-plugins/0.log"
Apr 16 21:12:20.808652 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:20.808627 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4llv8_6321d139-42ba-4ad4-96d3-6dafabbdc869/bond-cni-plugin/0.log"
Apr 16 21:12:20.829853 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:20.829829 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4llv8_6321d139-42ba-4ad4-96d3-6dafabbdc869/routeoverride-cni/0.log"
Apr 16 21:12:20.850939 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:20.850918 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4llv8_6321d139-42ba-4ad4-96d3-6dafabbdc869/whereabouts-cni-bincopy/0.log"
Apr 16 21:12:20.872113 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:20.872076 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4llv8_6321d139-42ba-4ad4-96d3-6dafabbdc869/whereabouts-cni/0.log"
Apr 16 21:12:21.296245 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:21.296173 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s4bbh_aae5927d-11b8-46a5-a3e9-c3be8d357974/kube-multus/0.log"
Apr 16 21:12:21.405709 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:21.405667 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jdfnl_02d874be-6206-4feb-99d1-3539318d290b/network-metrics-daemon/0.log"
Apr 16 21:12:21.428247 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:21.428220 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jdfnl_02d874be-6206-4feb-99d1-3539318d290b/kube-rbac-proxy/0.log"
Apr 16 21:12:22.148253 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:22.148225 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8lrq8_9d24ecb8-b036-4a27-8ff3-1283740a16d5/ovn-controller/0.log"
Apr 16 21:12:22.186392 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:22.186370 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8lrq8_9d24ecb8-b036-4a27-8ff3-1283740a16d5/ovn-acl-logging/0.log"
Apr 16 21:12:22.206467 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:22.206443 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8lrq8_9d24ecb8-b036-4a27-8ff3-1283740a16d5/kube-rbac-proxy-node/0.log"
Apr 16 21:12:22.229854 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:22.229824 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8lrq8_9d24ecb8-b036-4a27-8ff3-1283740a16d5/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 21:12:22.250429 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:22.250411 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8lrq8_9d24ecb8-b036-4a27-8ff3-1283740a16d5/northd/0.log"
Apr 16 21:12:22.271883 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:22.271861 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8lrq8_9d24ecb8-b036-4a27-8ff3-1283740a16d5/nbdb/0.log"
Apr 16 21:12:22.293183 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:22.293155 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8lrq8_9d24ecb8-b036-4a27-8ff3-1283740a16d5/sbdb/0.log"
Apr 16 21:12:22.398586 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:22.398508 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8lrq8_9d24ecb8-b036-4a27-8ff3-1283740a16d5/ovnkube-controller/0.log"
Apr 16 21:12:24.153985 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:24.153949 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-tkpmf_3713fd95-2eeb-4bfb-9189-fc6bf6ca4b38/network-check-target-container/0.log"
Apr 16 21:12:25.026034 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:25.025998 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qmblv_88690dc9-b75a-4009-b53f-717dd6e43bda/iptables-alerter/0.log"
Apr 16 21:12:25.654780 ip-10-0-142-60 kubenswrapper[2566]: I0416 21:12:25.654747 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-68ppm_3b621d74-7f5b-47e2-afbb-a2c1610adc49/tuned/0.log"