Apr 22 15:58:35.448394 ip-10-0-135-152 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 15:58:35.882775 ip-10-0-135-152 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:58:35.882775 ip-10-0-135-152 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 15:58:35.882775 ip-10-0-135-152 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:58:35.882775 ip-10-0-135-152 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 15:58:35.882775 ip-10-0-135-152 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
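The deprecation notices above all point at the same remedy: move the flags into the KubeletConfiguration file passed via --config. As a rough illustrative sketch only (field values below are invented, not read from this host), the flagged options map onto config-file fields roughly like this:

```yaml
# Hypothetical KubeletConfiguration covering the deprecated flags above.
# All values are placeholders for illustration, not taken from this node.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"   # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # replaces --volume-plugin-dir
systemReserved:                                               # replaces --system-reserved
  cpu: "500m"
  memory: "1Gi"
evictionHard:          # the --minimum-container-ttl-duration warning points to eviction settings instead
  memory.available: "100Mi"
```

Per the last warning, --pod-infra-container-image has no config-file equivalent here; the sandbox image is expected to come from the CRI runtime's own configuration.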
Apr 22 15:58:35.884528 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.884444 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 15:58:35.889261 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889234 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:58:35.889261 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889254 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:58:35.889261 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889257 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:58:35.889261 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889261 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:58:35.889261 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889264 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:58:35.889261 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889267 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:58:35.889261 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889270 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889273 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889276 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889281 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889286 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889289 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889292 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889295 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889298 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889301 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889304 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889307 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889310 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889312 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889315 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889318 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889322 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889325 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889328 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:58:35.889504 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889331 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889334 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889337 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889339 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889342 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889345 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889348 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889351 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889354 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889356 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889359 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889361 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889364 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889366 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889369 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889372 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889375 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889377 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889380 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889383 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:58:35.889990 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889385 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889388 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889390 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889393 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889395 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889398 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889400 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889403 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889406 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889408 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889411 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889414 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889416 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889419 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889421 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889423 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889426 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889429 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889431 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889434 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:58:35.890514 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889436 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889439 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889441 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889444 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889447 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889449 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889451 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889455 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889458 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889461 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889463 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889466 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889468 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889476 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889478 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889481 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889483 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889486 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889489 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889491 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:58:35.890996 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889494 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889888 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889893 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889896 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889898 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889901 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889903 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889906 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889909 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889911 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889914 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889917 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889919 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889923 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889926 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889930 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889933 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889936 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889939 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:58:35.891489 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889943 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889946 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889949 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889952 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889954 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889957 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889960 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889963 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889965 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889968 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889970 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889972 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889975 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889977 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889980 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889982 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889985 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889987 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889990 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889992 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:58:35.891943 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889995 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.889997 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890000 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890002 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890005 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890007 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890010 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890012 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890015 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890018 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890020 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890022 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890027 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890030 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890032 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890035 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890038 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890040 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890042 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890045 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:58:35.892462 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890048 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890050 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890053 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890056 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890058 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890060 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890063 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890065 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890068 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890070 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890073 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890077 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890081 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890102 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890114 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890117 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890119 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890122 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890124 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890128 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:58:35.892946 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890131 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890133 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890135 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890138 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890141 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890144 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890146 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.890149 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891652 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891661 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891668 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891672 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891677 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891680 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891685 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891689 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891692 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891695 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891698 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891701 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891704 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891707 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891710 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 15:58:35.893461 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891713 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891716 2565 flags.go:64] FLAG: --cloud-config=""
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891718 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891721 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891725 2565 flags.go:64] FLAG: --cluster-domain=""
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891727 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891730 2565 flags.go:64] FLAG: --config-dir=""
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891733 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891736 2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891740 2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891743 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891746 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891748 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891752 2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891755 2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891758 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891761 2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891764 2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891768 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891772 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891775 2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891777 2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891781 2565 flags.go:64] FLAG: --enable-server="true"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891784 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891788 2565 flags.go:64] FLAG: --event-burst="100"
Apr 22 15:58:35.894033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891791 2565 flags.go:64] FLAG: --event-qps="50"
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891794 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891797 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891800 2565 flags.go:64] FLAG: --eviction-hard=""
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891803 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891806 2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891809 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891812 2565 flags.go:64] FLAG: --eviction-soft=""
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891814 2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891817 2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891820 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891823 2565 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891826 2565 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891829 2565 flags.go:64] FLAG:
--fail-swap-on="true" Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891831 2565 flags.go:64] FLAG: --feature-gates="" Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891835 2565 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891838 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891841 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891844 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891846 2565 flags.go:64] FLAG: --healthz-port="10248" Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891852 2565 flags.go:64] FLAG: --help="false" Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891855 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-135-152.ec2.internal" Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891858 2565 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891860 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 15:58:35.894659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891863 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891867 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891870 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 
15:58:35.891873 2565 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891876 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891878 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891882 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891884 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891887 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891890 2565 flags.go:64] FLAG: --kube-reserved="" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891893 2565 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891896 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891899 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891901 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891904 2565 flags.go:64] FLAG: --lock-file="" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891907 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891910 2565 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891912 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 15:58:35.895271 
ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891917 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891920 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891923 2565 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891926 2565 flags.go:64] FLAG: --logging-format="text" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891928 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891932 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 15:58:35.895271 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891934 2565 flags.go:64] FLAG: --manifest-url="" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891937 2565 flags.go:64] FLAG: --manifest-url-header="" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891941 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891944 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891950 2565 flags.go:64] FLAG: --max-pods="110" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891954 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891957 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891960 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891963 2565 
flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891965 2565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891968 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891972 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891979 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891982 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891985 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891988 2565 flags.go:64] FLAG: --pod-cidr="" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891991 2565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891996 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.891999 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892002 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892004 2565 flags.go:64] FLAG: --port="10250" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892007 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: 
I0422 15:58:35.892010 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ab3d326ad3c30bef" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892013 2565 flags.go:64] FLAG: --qos-reserved="" Apr 22 15:58:35.895848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892016 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892019 2565 flags.go:64] FLAG: --register-node="true" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892022 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892024 2565 flags.go:64] FLAG: --register-with-taints="" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892028 2565 flags.go:64] FLAG: --registry-burst="10" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892030 2565 flags.go:64] FLAG: --registry-qps="5" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892033 2565 flags.go:64] FLAG: --reserved-cpus="" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892036 2565 flags.go:64] FLAG: --reserved-memory="" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892039 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892042 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892045 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892048 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892055 2565 flags.go:64] FLAG: --runonce="false" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: 
I0422 15:58:35.892058 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892061 2565 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892064 2565 flags.go:64] FLAG: --seccomp-default="false" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892067 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892069 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892072 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892075 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892078 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892081 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892101 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892105 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892108 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892112 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 15:58:35.896443 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892115 2565 flags.go:64] FLAG: --system-cgroups="" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892118 2565 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892123 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892126 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892128 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892132 2565 flags.go:64] FLAG: --tls-min-version="" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892135 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892138 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892140 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892143 2565 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892146 2565 flags.go:64] FLAG: --v="2" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892150 2565 flags.go:64] FLAG: --version="false" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892154 2565 flags.go:64] FLAG: --vmodule="" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892158 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.892161 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892257 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:58:35.897202 ip-10-0-135-152 
kubenswrapper[2565]: W0422 15:58:35.892261 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892263 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892267 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892270 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892274 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892277 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:58:35.897202 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892280 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892283 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892285 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892288 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892291 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892294 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892296 2565 
feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892300 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892303 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892307 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892310 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892313 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892315 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892318 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892320 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892322 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892325 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892327 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892330 2565 feature_gate.go:328] unrecognized feature 
gate: BuildCSIVolumes Apr 22 15:58:35.897730 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892332 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892335 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892337 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892340 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892342 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892345 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892347 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892349 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892352 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892355 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892358 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892361 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: 
W0422 15:58:35.892363 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892366 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892368 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892371 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892373 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892376 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892378 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:58:35.898242 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892381 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892384 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892386 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892388 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892392 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892394 2565 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892397 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892399 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892402 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892404 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892407 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892409 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892411 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892413 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892416 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892418 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892421 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892423 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 
15:58:35.892426 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892429 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:58:35.898751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892431 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892434 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892438 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892440 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892443 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892445 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892448 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892450 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892452 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892455 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892457 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:58:35.899244 ip-10-0-135-152 
kubenswrapper[2565]: W0422 15:58:35.892459 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892463 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892465 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892468 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892470 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892473 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892476 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892479 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892481 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:58:35.899244 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.892484 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:58:35.899724 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.893075 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:58:35.900435 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.900417 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 15:58:35.900470 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.900438 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 15:58:35.900503 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900493 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:58:35.900503 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900498 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:58:35.900503 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900501 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900505 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900510 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900514 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900519 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900523 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900526 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900529 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900531 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900534 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900536 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900539 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900542 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900544 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900547 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900549 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900552 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900554 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900556 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900559 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:58:35.900577 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900561 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900563 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900566 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900568 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900571 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900574 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900576 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900579 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900581 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900585 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900591 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900595 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900600 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900605 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900609 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900612 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900614 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900617 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900620 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900622 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:58:35.901109 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900625 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900627 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900630 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900633 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900635 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900638 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900640 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900642 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900645 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900647 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900650 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900652 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900654 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900657 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900660 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900663 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900666 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900673 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900679 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900684 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:58:35.901597 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900686 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900689 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900697 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900699 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900702 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900704 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900707 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900709 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900712 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900714 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900717 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900719 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900721 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900724 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900726 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900729 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900731 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900734 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900736 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900739 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:58:35.902099 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900741 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900744 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900747 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900751 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.900759 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900878 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900883 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900887 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900889 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900893 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900898 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900901 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900904 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900907 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900911 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:58:35.902598 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900916 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900921 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900926 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900930 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900932 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900935 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900938 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900941 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900943 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900946 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900948 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900951 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900953 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900956 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900959 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900961 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900964 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900968 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900970 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900973 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:58:35.902966 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900975 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900978 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900980 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900983 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900986 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900989 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900993 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.900997 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901001 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901005 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901009 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901012 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901014 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901017 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901019 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901022 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901024 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901026 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901029 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901031 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:58:35.903452 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901034 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901036 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901038 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901041 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901043 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901046 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901048 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901050 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901053 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901055 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901058 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901060 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901062 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901065 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901067 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901071 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901076 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901080 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901100 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901104 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:58:35.903979 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901106 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901109 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901112 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901115 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901117 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901120 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901124 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901126 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901128 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901131 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901133 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901136 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901139 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901141 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901144 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:58:35.904479 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:35.901147 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:58:35.904864 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.901153 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:58:35.904864 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.901267 2565 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 15:58:35.904864 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.903407 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 15:58:35.904864 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.904422 2565 server.go:1019] "Starting client certificate rotation"
Apr 22 15:58:35.904864 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.904517 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:58:35.904864 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.904554 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:58:35.930224 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.930210 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:58:35.932460 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.932444 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:58:35.948377 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.948356 2565 log.go:25] "Validated CRI v1 runtime API"
Apr 22 15:58:35.955233 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.955211 2565 log.go:25] "Validated CRI v1 image API"
Apr 22 15:58:35.956399 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.956379 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 15:58:35.961408 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.960820 2565 fs.go:135] Filesystem UUIDs: map[0e66bf52-2136-4242-9536-9cc66082bbba:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 ae35c8ab-9b52-4291-ac4e-1a08ecf03a16:/dev/nvme0n1p3]
Apr 22 15:58:35.961408 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.961401 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 15:58:35.961534 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.961406 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 15:58:35.967201 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.967074 2565 manager.go:217] Machine: {Timestamp:2026-04-22 15:58:35.965224272 +0000 UTC m=+0.400317960 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099190 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bbfd2373c24e3510d89277cc07cb7 SystemUUID:ec2bbfd2-373c-24e3-510d-89277cc07cb7 BootID:b1da7925-ae9d-4acf-bcee-f618c7721ff9 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b7:e2:34:5c:e5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b7:e2:34:5c:e5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ae:e1:bf:6e:f8:73 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 15:58:35.967201 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.967195 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 15:58:35.967315 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.967304 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 15:58:35.968357 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.968336 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 15:58:35.968491 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.968360 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-152.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 15:58:35.968538 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.968500 2565 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 15:58:35.968538 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.968509 2565 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 15:58:35.968538 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.968521 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 15:58:35.969251 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.969241 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 15:58:35.970557 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.970546 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:58:35.970664 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.970656 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 15:58:35.973045 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.973036 2565 kubelet.go:491] "Attempting to sync node with API server" Apr 22 15:58:35.973081 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.973049 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 15:58:35.973081 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.973061 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 15:58:35.973081 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.973070 2565 kubelet.go:397] "Adding apiserver pod source" Apr 22 15:58:35.973081 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.973079 2565 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 22 15:58:35.974047 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.974035 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 15:58:35.974105 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.974054 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 15:58:35.977046 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.977026 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 15:58:35.978320 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.978305 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 15:58:35.979532 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.979516 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 15:58:35.979532 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.979534 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 15:58:35.979633 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.979541 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 15:58:35.979633 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.979546 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 15:58:35.979633 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.979552 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 15:58:35.979633 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.979558 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 15:58:35.979633 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.979564 2565 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 22 15:58:35.979633 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.979569 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 15:58:35.979633 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.979576 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 15:58:35.979633 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.979581 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 15:58:35.979633 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.979589 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 15:58:35.979633 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.979598 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 15:58:35.980392 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.980382 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 15:58:35.980392 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.980392 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 15:58:35.984138 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.983995 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 15:58:35.984226 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.984162 2565 server.go:1295] "Started kubelet" Apr 22 15:58:35.984594 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.984452 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 15:58:35.984670 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.984643 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 15:58:35.984749 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.984735 2565 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 15:58:35.985070 ip-10-0-135-152 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 15:58:35.986767 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.986743 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-152.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 15:58:35.986942 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:35.986784 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 15:58:35.988288 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.988268 2565 server.go:317] "Adding debug handlers to kubelet server" Apr 22 15:58:35.988642 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:35.987135 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-152.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 15:58:35.989297 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.988852 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 15:58:35.994565 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:35.994546 2565 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 15:58:35.994788 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:35.993873 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-152.ec2.internal.18a8b90c6b08a8eb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-152.ec2.internal,UID:ip-10-0-135-152.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-152.ec2.internal,},FirstTimestamp:2026-04-22 15:58:35.984136427 +0000 UTC m=+0.419230116,LastTimestamp:2026-04-22 15:58:35.984136427 +0000 UTC m=+0.419230116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-152.ec2.internal,}" Apr 22 15:58:35.994894 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.994851 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 15:58:35.995435 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.995421 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 15:58:35.996142 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.996124 2565 factory.go:55] Registering systemd factory Apr 22 15:58:35.996236 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.996164 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 15:58:35.996236 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.996180 2565 factory.go:223] Registration of the systemd container factory successfully Apr 22 15:58:35.996363 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.996341 2565 
volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 15:58:35.996363 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.996366 2565 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 15:58:35.996508 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:35.996369 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:35.996508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.996408 2565 factory.go:153] Registering CRI-O factory Apr 22 15:58:35.996508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.996423 2565 factory.go:223] Registration of the crio container factory successfully Apr 22 15:58:35.996508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.996447 2565 reconstruct.go:97] "Volume reconstruction finished" Apr 22 15:58:35.996508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.996460 2565 reconciler.go:26] "Reconciler: start to sync state" Apr 22 15:58:35.996508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.996469 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 15:58:35.996508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.996493 2565 factory.go:103] Registering Raw factory Apr 22 15:58:35.996508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.996508 2565 manager.go:1196] Started watching for new ooms in manager Apr 22 15:58:35.997399 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:35.997384 2565 manager.go:319] Starting recovery of all containers Apr 22 15:58:36.001832 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.001806 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-152.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get 
resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 15:58:36.001956 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.001936 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 15:58:36.007985 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.007969 2565 manager.go:324] Recovery completed Apr 22 15:58:36.009768 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.009751 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rhk59" Apr 22 15:58:36.012442 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.012428 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:58:36.014814 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.014796 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rhk59" Apr 22 15:58:36.014903 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.014818 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:58:36.014903 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.014878 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:58:36.014903 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.014894 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:58:36.015456 ip-10-0-135-152 kubenswrapper[2565]: I0422 
15:58:36.015441 2565 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 15:58:36.015456 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.015456 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 15:58:36.015552 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.015470 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:58:36.016608 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.016415 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-152.ec2.internal.18a8b90c6cdd4fd7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-152.ec2.internal,UID:ip-10-0-135-152.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-152.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-152.ec2.internal,},FirstTimestamp:2026-04-22 15:58:36.014850007 +0000 UTC m=+0.449943697,LastTimestamp:2026-04-22 15:58:36.014850007 +0000 UTC m=+0.449943697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-152.ec2.internal,}" Apr 22 15:58:36.019577 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.019558 2565 policy_none.go:49] "None policy: Start" Apr 22 15:58:36.019577 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.019577 2565 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 15:58:36.019676 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.019587 2565 state_mem.go:35] "Initializing new in-memory state store" Apr 22 15:58:36.075138 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.055625 2565 manager.go:341] "Starting Device Plugin manager" Apr 22 
15:58:36.075138 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.055650 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 15:58:36.075138 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.055664 2565 server.go:85] "Starting device plugin registration server" Apr 22 15:58:36.075138 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.055895 2565 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 15:58:36.075138 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.055905 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 15:58:36.075138 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.056001 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 15:58:36.075138 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.056102 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 15:58:36.075138 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.056112 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 15:58:36.075138 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.056574 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 15:58:36.075138 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.056621 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:36.119493 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.119460 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 15:58:36.120746 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.120728 2565 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 15:58:36.120807 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.120761 2565 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 15:58:36.120807 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.120783 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 15:58:36.120807 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.120792 2565 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 15:58:36.120920 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.120830 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 15:58:36.123318 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.123296 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:36.156781 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.156731 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:58:36.157646 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.157633 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:58:36.157700 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.157663 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:58:36.157700 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.157673 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:58:36.157700 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.157693 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.166063 ip-10-0-135-152 kubenswrapper[2565]: I0422 
15:58:36.166050 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.166146 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.166068 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-152.ec2.internal\": node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:36.196836 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.196817 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:36.221283 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.221256 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-135-152.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal"] Apr 22 15:58:36.221362 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.221318 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:58:36.222114 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.222098 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:58:36.222179 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.222128 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:58:36.222179 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.222143 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:58:36.224377 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.224367 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:58:36.224504 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.224492 2565 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.224556 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.224518 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:58:36.225007 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.224993 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:58:36.225069 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.225011 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:58:36.225069 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.225035 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:58:36.225069 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.225048 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:58:36.225179 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.225016 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:58:36.225179 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.225121 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:58:36.227347 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.227332 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.227431 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.227357 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:58:36.227935 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.227918 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:58:36.228025 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.227944 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:58:36.228025 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.227953 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:58:36.256630 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.256615 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-152.ec2.internal\" not found" node="ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.260466 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.260450 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-152.ec2.internal\" not found" node="ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.297000 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.296977 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:36.397491 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.397463 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:36.397608 ip-10-0-135-152 kubenswrapper[2565]: I0422 
15:58:36.397537 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/53bf7234ced363a99f60729950d50036-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal\" (UID: \"53bf7234ced363a99f60729950d50036\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.397608 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.397562 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53bf7234ced363a99f60729950d50036-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal\" (UID: \"53bf7234ced363a99f60729950d50036\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.397608 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.397578 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/04b47764dfb923467fa0be6032d47f9e-config\") pod \"kube-apiserver-proxy-ip-10-0-135-152.ec2.internal\" (UID: \"04b47764dfb923467fa0be6032d47f9e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.498213 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.498140 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:36.498213 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.498179 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/53bf7234ced363a99f60729950d50036-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal\" (UID: \"53bf7234ced363a99f60729950d50036\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.498213 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.498204 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53bf7234ced363a99f60729950d50036-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal\" (UID: \"53bf7234ced363a99f60729950d50036\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.498213 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.498222 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/04b47764dfb923467fa0be6032d47f9e-config\") pod \"kube-apiserver-proxy-ip-10-0-135-152.ec2.internal\" (UID: \"04b47764dfb923467fa0be6032d47f9e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.498391 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.498256 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/53bf7234ced363a99f60729950d50036-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal\" (UID: \"53bf7234ced363a99f60729950d50036\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.498391 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.498259 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/53bf7234ced363a99f60729950d50036-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal\" (UID: \"53bf7234ced363a99f60729950d50036\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.498391 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.498300 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/04b47764dfb923467fa0be6032d47f9e-config\") pod \"kube-apiserver-proxy-ip-10-0-135-152.ec2.internal\" (UID: \"04b47764dfb923467fa0be6032d47f9e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.559400 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.559359 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.563027 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.563009 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" Apr 22 15:58:36.599129 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.599101 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:36.699527 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.699494 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:36.799892 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.799840 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:36.900300 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:36.900279 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:36.904467 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.904452 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 15:58:36.904598 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.904583 2565 reflector.go:556] "Warning: watch ended with 
error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 15:58:36.995609 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:36.995579 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 15:58:37.000724 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:37.000704 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:37.006578 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.006556 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 15:58:37.017513 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.017477 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 15:53:36 +0000 UTC" deadline="2027-11-01 06:13:02.627151773 +0000 UTC" Apr 22 15:58:37.017513 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.017508 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13382h14m25.609646616s" Apr 22 15:58:37.025788 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.025771 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rn5kh" Apr 22 15:58:37.033555 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.033529 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rn5kh" Apr 22 15:58:37.101140 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:37.101110 2565 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-135-152.ec2.internal\" not found" Apr 22 15:58:37.175568 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.175543 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:37.189372 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:37.189347 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04b47764dfb923467fa0be6032d47f9e.slice/crio-c7f14f8f70152b2b9bebaae25069711904f6b9e0fb8dc567f410909d1a7fab0b WatchSource:0}: Error finding container c7f14f8f70152b2b9bebaae25069711904f6b9e0fb8dc567f410909d1a7fab0b: Status 404 returned error can't find the container with id c7f14f8f70152b2b9bebaae25069711904f6b9e0fb8dc567f410909d1a7fab0b Apr 22 15:58:37.189751 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:37.189724 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53bf7234ced363a99f60729950d50036.slice/crio-f927016926cc3c3bd9a49e62029a3646f011d2d779e9ad9ef1806e1452f15f69 WatchSource:0}: Error finding container f927016926cc3c3bd9a49e62029a3646f011d2d779e9ad9ef1806e1452f15f69: Status 404 returned error can't find the container with id f927016926cc3c3bd9a49e62029a3646f011d2d779e9ad9ef1806e1452f15f69 Apr 22 15:58:37.194198 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.194183 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:58:37.195792 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.195774 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-152.ec2.internal" Apr 22 15:58:37.204758 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.204737 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must 
not contain dots]" Apr 22 15:58:37.207630 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.207616 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" Apr 22 15:58:37.217449 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.217433 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 15:58:37.393448 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.393374 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:37.481039 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.481008 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:37.931265 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.931225 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:58:37.974366 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.974140 2565 apiserver.go:52] "Watching apiserver" Apr 22 15:58:37.981020 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.980999 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 15:58:37.981517 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.981490 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/konnectivity-agent-rlxkj","kube-system/kube-apiserver-proxy-ip-10-0-135-152.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc","openshift-dns/node-resolver-k6xqz","openshift-network-diagnostics/network-check-target-57wkv","openshift-cluster-node-tuning-operator/tuned-kksz7","openshift-image-registry/node-ca-9cj4f","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal","openshift-multus/multus-4j4vm","openshift-multus/multus-additional-cni-plugins-c7hk6","openshift-multus/network-metrics-daemon-2nbv7","openshift-network-operator/iptables-alerter-8dxdj","openshift-ovn-kubernetes/ovnkube-node-95pm2","kube-system/global-pull-secret-syncer-h264g"] Apr 22 15:58:37.986924 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.986901 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:37.989414 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.989393 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 15:58:37.989499 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.989455 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" Apr 22 15:58:37.989659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.989643 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 15:58:37.989808 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.989796 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 15:58:37.989973 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.989960 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-d8m5g\"" Apr 22 15:58:37.990074 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.990064 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 15:58:37.990722 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.990702 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 15:58:37.990963 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.990947 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 15:58:37.991299 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.991277 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 15:58:37.992674 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.992059 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-k6xqz" Apr 22 15:58:37.992674 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.992182 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 15:58:37.992674 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.992436 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wnwkd\"" Apr 22 15:58:37.992674 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.992605 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 15:58:37.994689 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.994666 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9cj4f" Apr 22 15:58:37.995048 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.995033 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 15:58:37.995154 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.995077 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 15:58:37.995509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.995476 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hfl9r\"" Apr 22 15:58:37.997313 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.997295 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 15:58:37.997615 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.997598 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 15:58:37.997874 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.997857 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pmvxt\"" Apr 22 15:58:37.998043 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:37.998029 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 15:58:38.000690 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.000299 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:38.000690 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.000377 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:58:38.000690 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.000539 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.007815 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.007598 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4775e631-4da8-45cc-9fb4-6238451abe84-ovnkube-script-lib\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.007815 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.007642 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-run-netns\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.007815 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.007675 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-run-systemd\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.007815 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.007772 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-run-ovn\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.007815 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.007809 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-run-ovn-kubernetes\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008049 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.007832 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-cni-netd\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008049 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.007877 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-sys-fs\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" Apr 22 15:58:38.008049 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.007913 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-systemd\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.008049 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008014 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-slash\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008049 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008045 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-log-socket\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008258 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008069 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-cni-bin\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008258 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008117 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008258 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008167 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4775e631-4da8-45cc-9fb4-6238451abe84-env-overrides\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008258 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008193 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-sys\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.008258 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008212 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-var-lib-kubelet\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.008258 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008232 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gwxz\" (UniqueName: \"kubernetes.io/projected/4ed416cd-6dad-4f24-869d-bbc8be60891d-kube-api-access-2gwxz\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.008508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008271 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-registration-dir\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" Apr 22 15:58:38.008508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008306 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-etc-selinux\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" Apr 22 15:58:38.008508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008334 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-modprobe-d\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.008508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008367 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b5899ec-33ba-45f8-b259-82f0af9723a4-host\") pod \"node-ca-9cj4f\" (UID: \"2b5899ec-33ba-45f8-b259-82f0af9723a4\") " pod="openshift-image-registry/node-ca-9cj4f" Apr 22 15:58:38.008508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008392 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-etc-openvswitch\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008422 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-sysconfig\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.008508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008451 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr9vm\" (UniqueName: \"kubernetes.io/projected/2b5899ec-33ba-45f8-b259-82f0af9723a4-kube-api-access-jr9vm\") pod \"node-ca-9cj4f\" (UID: \"2b5899ec-33ba-45f8-b259-82f0af9723a4\") " pod="openshift-image-registry/node-ca-9cj4f" Apr 22 15:58:38.008508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008479 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-kubelet\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008508 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008509 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-systemd-units\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008859 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008543 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-var-lib-openvswitch\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008859 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008567 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-node-log\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008859 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008586 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a8967a65-4b16-4099-9db4-ce8642ba6138-hosts-file\") pod \"node-resolver-k6xqz\" (UID: \"a8967a65-4b16-4099-9db4-ce8642ba6138\") " pod="openshift-dns/node-resolver-k6xqz" 
Apr 22 15:58:38.008859 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008612 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-kubernetes\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.008859 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008643 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b5899ec-33ba-45f8-b259-82f0af9723a4-serviceca\") pod \"node-ca-9cj4f\" (UID: \"2b5899ec-33ba-45f8-b259-82f0af9723a4\") " pod="openshift-image-registry/node-ca-9cj4f" Apr 22 15:58:38.008859 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008662 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-run-openvswitch\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008859 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008680 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4775e631-4da8-45cc-9fb4-6238451abe84-ovn-node-metrics-cert\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.008859 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008711 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-socket-dir\") pod 
\"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" Apr 22 15:58:38.009207 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.008918 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4j4vm" Apr 22 15:58:38.009340 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.009319 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:38.009449 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.009427 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:58:38.009839 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.009815 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8967a65-4b16-4099-9db4-ce8642ba6138-tmp-dir\") pod \"node-resolver-k6xqz\" (UID: \"a8967a65-4b16-4099-9db4-ce8642ba6138\") " pod="openshift-dns/node-resolver-k6xqz" Apr 22 15:58:38.009926 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.009864 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmzmd\" (UniqueName: \"kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd\") pod \"network-check-target-57wkv\" (UID: \"edee20cb-f531-4653-852f-f16cccf9f024\") " pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:38.009926 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.009903 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-sysctl-d\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.010035 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.009934 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-sysctl-conf\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.010035 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.009967 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-lib-modules\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.010035 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.010014 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4775e631-4da8-45cc-9fb4-6238451abe84-ovnkube-config\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.010174 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.010030 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 15:58:38.010174 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.010053 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7wtrj\" (UniqueName: \"kubernetes.io/projected/4775e631-4da8-45cc-9fb4-6238451abe84-kube-api-access-7wtrj\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.010174 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.010112 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:58:38.010290 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.010038 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7szk9\""
Apr 22 15:58:38.010766 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.010747 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.010957 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.010937 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 15:58:38.011033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.010964 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-device-dir\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.011033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.011016 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pns2p\" (UniqueName: \"kubernetes.io/projected/a8967a65-4b16-4099-9db4-ce8642ba6138-kube-api-access-pns2p\") pod \"node-resolver-k6xqz\" (UID: \"a8967a65-4b16-4099-9db4-ce8642ba6138\") " pod="openshift-dns/node-resolver-k6xqz"
Apr 22 15:58:38.011164 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.011049 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-run\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.011164 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.011081 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-host\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.011269 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.011167 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lckwt\" (UniqueName: \"kubernetes.io/projected/87be9666-efda-4ad0-a50e-fb952cc19860-kube-api-access-lckwt\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.011269 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.011210 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-tuned\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.011269 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.011240 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ed416cd-6dad-4f24-869d-bbc8be60891d-tmp\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.012195 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.012174 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.012962 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.012941 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 15:58:38.014473 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.014452 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mfpfn\""
Apr 22 15:58:38.014574 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.014553 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 15:58:38.014775 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.014758 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 15:58:38.015621 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.015603 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7"
Apr 22 15:58:38.015713 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.015663 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22"
Apr 22 15:58:38.016133 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.016114 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-pqx4x\""
Apr 22 15:58:38.016330 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.016315 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 15:58:38.017240 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.017217 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 15:58:38.018629 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.018612 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rlxkj"
Apr 22 15:58:38.019600 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.019579 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8dxdj"
Apr 22 15:58:38.020929 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.020908 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8w28x\""
Apr 22 15:58:38.021171 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.021155 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 15:58:38.021847 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.021829 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 15:58:38.022187 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.022172 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 15:58:38.023674 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.022368 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-4cp5h\""
Apr 22 15:58:38.023971 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.022427 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 15:58:38.023971 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.022506 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:58:38.035049 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.035025 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:53:37 +0000 UTC" deadline="2027-09-22 13:34:26.395626996 +0000 UTC"
Apr 22 15:58:38.035168 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.035059 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12429h35m48.360582254s"
Apr 22 15:58:38.097395 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.097375 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 15:58:38.111770 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.111746 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-var-lib-cni-bin\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.111886 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.111785 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/590a23f3-74f3-406f-a0f0-bf1db0f7b0a0-konnectivity-ca\") pod \"konnectivity-agent-rlxkj\" (UID: \"590a23f3-74f3-406f-a0f0-bf1db0f7b0a0\") " pod="kube-system/konnectivity-agent-rlxkj"
Apr 22 15:58:38.111886 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.111815 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-socket-dir\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.111886 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.111836 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmzmd\" (UniqueName: \"kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd\") pod \"network-check-target-57wkv\" (UID: \"edee20cb-f531-4653-852f-f16cccf9f024\") " pod="openshift-network-diagnostics/network-check-target-57wkv"
Apr 22 15:58:38.111886 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.111853 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-sysctl-d\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.111886 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.111870 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-sysctl-conf\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.112113 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.111893 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-host\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.112113 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.111918 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8b2db19d-176e-4219-830f-a3b6ed5a34e0-dbus\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g"
Apr 22 15:58:38.112113 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.111945 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/98755c75-9268-4cfa-8cae-e8ccf20974be-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.112113 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.111971 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fh92\" (UniqueName: \"kubernetes.io/projected/d8b2518b-8eb8-4a43-bf50-6f370663ae7c-kube-api-access-9fh92\") pod \"iptables-alerter-8dxdj\" (UID: \"d8b2518b-8eb8-4a43-bf50-6f370663ae7c\") " pod="openshift-network-operator/iptables-alerter-8dxdj"
Apr 22 15:58:38.112113 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.111996 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-device-dir\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.112113 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112019 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8b2db19d-176e-4219-830f-a3b6ed5a34e0-kubelet-config\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g"
Apr 22 15:58:38.112113 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112041 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/98755c75-9268-4cfa-8cae-e8ccf20974be-os-release\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.112113 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112067 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lckwt\" (UniqueName: \"kubernetes.io/projected/87be9666-efda-4ad0-a50e-fb952cc19860-kube-api-access-lckwt\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.112113 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112108 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-run-netns\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112138 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/98755c75-9268-4cfa-8cae-e8ccf20974be-system-cni-dir\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112162 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-system-cni-dir\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112187 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4775e631-4da8-45cc-9fb4-6238451abe84-ovnkube-script-lib\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112206 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-device-dir\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112210 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-run-systemd\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112246 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-run-systemd\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112254 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-run-ovn\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112280 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-run-ovn-kubernetes\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112302 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-cni-netd\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112302 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-socket-dir\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112324 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-systemd\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112345 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-log-socket\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112374 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-cni-bin\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112401 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4775e631-4da8-45cc-9fb4-6238451abe84-env-overrides\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112421 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-sys\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112444 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gwxz\" (UniqueName: \"kubernetes.io/projected/4ed416cd-6dad-4f24-869d-bbc8be60891d-kube-api-access-2gwxz\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.112509 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112468 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b5899ec-33ba-45f8-b259-82f0af9723a4-host\") pod \"node-ca-9cj4f\" (UID: \"2b5899ec-33ba-45f8-b259-82f0af9723a4\") " pod="openshift-image-registry/node-ca-9cj4f"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112493 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-cnibin\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112519 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112632 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-run-ovn-kubernetes\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112665 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-run-ovn\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112881 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-cni-netd\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112934 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-systemd\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112941 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-sysctl-conf\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112969 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-log-socket\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.112990 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-sys\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113001 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-cni-bin\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113024 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b5899ec-33ba-45f8-b259-82f0af9723a4-host\") pod \"node-ca-9cj4f\" (UID: \"2b5899ec-33ba-45f8-b259-82f0af9723a4\") " pod="openshift-image-registry/node-ca-9cj4f"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113061 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-registration-dir\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113122 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-etc-selinux\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113129 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-sysctl-d\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113160 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-modprobe-d\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113169 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4775e631-4da8-45cc-9fb4-6238451abe84-ovnkube-script-lib\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.113216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113190 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/98755c75-9268-4cfa-8cae-e8ccf20974be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113199 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-etc-selinux\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113226 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-host\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113244 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-registration-dir\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113286 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d8b2518b-8eb8-4a43-bf50-6f370663ae7c-iptables-alerter-script\") pod \"iptables-alerter-8dxdj\" (UID: \"d8b2518b-8eb8-4a43-bf50-6f370663ae7c\") " pod="openshift-network-operator/iptables-alerter-8dxdj"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113304 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-modprobe-d\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113348 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-sysconfig\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113376 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jr9vm\" (UniqueName: \"kubernetes.io/projected/2b5899ec-33ba-45f8-b259-82f0af9723a4-kube-api-access-jr9vm\") pod \"node-ca-9cj4f\" (UID: \"2b5899ec-33ba-45f8-b259-82f0af9723a4\") " pod="openshift-image-registry/node-ca-9cj4f"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113417 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-multus-conf-dir\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113452 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-sysconfig\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113475 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz46x\" (UniqueName: \"kubernetes.io/projected/d9391598-fc74-406d-ad2b-087fbbe59063-kube-api-access-gz46x\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113509 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtfmb\" (UniqueName: \"kubernetes.io/projected/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-kube-api-access-rtfmb\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113538 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-kubelet\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113561 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-systemd-units\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113584 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a8967a65-4b16-4099-9db4-ce8642ba6138-hosts-file\") pod \"node-resolver-k6xqz\" (UID: \"a8967a65-4b16-4099-9db4-ce8642ba6138\") " pod="openshift-dns/node-resolver-k6xqz"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113590 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-kubelet\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113606 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b5899ec-33ba-45f8-b259-82f0af9723a4-serviceca\") pod \"node-ca-9cj4f\" (UID: \"2b5899ec-33ba-45f8-b259-82f0af9723a4\") " pod="openshift-image-registry/node-ca-9cj4f"
Apr 22 15:58:38.113987 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113625 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-systemd-units\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113631 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-var-lib-kubelet\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113654 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a8967a65-4b16-4099-9db4-ce8642ba6138-hosts-file\") pod \"node-resolver-k6xqz\" (UID: \"a8967a65-4b16-4099-9db4-ce8642ba6138\") " pod="openshift-dns/node-resolver-k6xqz"
Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113657 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9391598-fc74-406d-ad2b-087fbbe59063-multus-daemon-config\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113681 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-run-openvswitch\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113706 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4775e631-4da8-45cc-9fb4-6238451abe84-ovn-node-metrics-cert\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113730 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8967a65-4b16-4099-9db4-ce8642ba6138-tmp-dir\") pod \"node-resolver-k6xqz\" (UID: \"a8967a65-4b16-4099-9db4-ce8642ba6138\") " pod="openshift-dns/node-resolver-k6xqz"
Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113753 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName:
\"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-lib-modules\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113776 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-os-release\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm" Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113798 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/98755c75-9268-4cfa-8cae-e8ccf20974be-cni-binary-copy\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6" Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113823 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4775e631-4da8-45cc-9fb4-6238451abe84-ovnkube-config\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113847 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wtrj\" (UniqueName: \"kubernetes.io/projected/4775e631-4da8-45cc-9fb4-6238451abe84-kube-api-access-7wtrj\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113870 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113893 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pns2p\" (UniqueName: \"kubernetes.io/projected/a8967a65-4b16-4099-9db4-ce8642ba6138-kube-api-access-pns2p\") pod \"node-resolver-k6xqz\" (UID: \"a8967a65-4b16-4099-9db4-ce8642ba6138\") " pod="openshift-dns/node-resolver-k6xqz" Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113916 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-run\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113939 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-multus-socket-dir-parent\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm" Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113964 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-run-openvswitch\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.114759 ip-10-0-135-152 kubenswrapper[2565]: I0422 
15:58:38.113964 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-run-k8s-cni-cncf-io\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113992 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-run-multus-certs\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114003 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-lib-modules\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114018 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-tuned\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114020 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" Apr 22 15:58:38.115472 
ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114035 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ed416cd-6dad-4f24-869d-bbc8be60891d-tmp\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113911 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4775e631-4da8-45cc-9fb4-6238451abe84-env-overrides\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.113941 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b5899ec-33ba-45f8-b259-82f0af9723a4-serviceca\") pod \"node-ca-9cj4f\" (UID: \"2b5899ec-33ba-45f8-b259-82f0af9723a4\") " pod="openshift-image-registry/node-ca-9cj4f" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114127 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-multus-cni-dir\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114153 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9391598-fc74-406d-ad2b-087fbbe59063-cni-binary-copy\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 
15:58:38.114180 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-run-netns\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114209 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-sys-fs\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114233 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-var-lib-kubelet\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114258 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-etc-kubernetes\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114282 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/98755c75-9268-4cfa-8cae-e8ccf20974be-cnibin\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6" Apr 22 
15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114306 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/98755c75-9268-4cfa-8cae-e8ccf20974be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114332 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114394 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-var-lib-kubelet\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.115472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114629 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-run-netns\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114665 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8967a65-4b16-4099-9db4-ce8642ba6138-tmp-dir\") pod \"node-resolver-k6xqz\" (UID: \"a8967a65-4b16-4099-9db4-ce8642ba6138\") " pod="openshift-dns/node-resolver-k6xqz" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114685 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/87be9666-efda-4ad0-a50e-fb952cc19860-sys-fs\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114689 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-slash\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114725 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-slash\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114763 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114818 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-var-lib-cni-multus\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm" Apr 22 15:58:38.116003 
ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114847 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hg4m\" (UniqueName: \"kubernetes.io/projected/98755c75-9268-4cfa-8cae-e8ccf20974be-kube-api-access-5hg4m\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114873 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114897 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/590a23f3-74f3-406f-a0f0-bf1db0f7b0a0-agent-certs\") pod \"konnectivity-agent-rlxkj\" (UID: \"590a23f3-74f3-406f-a0f0-bf1db0f7b0a0\") " pod="kube-system/konnectivity-agent-rlxkj" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114923 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-etc-openvswitch\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114948 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-hostroot\") pod \"multus-4j4vm\" (UID: 
\"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.114985 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d8b2518b-8eb8-4a43-bf50-6f370663ae7c-host-slash\") pod \"iptables-alerter-8dxdj\" (UID: \"d8b2518b-8eb8-4a43-bf50-6f370663ae7c\") " pod="openshift-network-operator/iptables-alerter-8dxdj" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.115005 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-etc-openvswitch\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.115023 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-var-lib-openvswitch\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.115072 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-node-log\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.115117 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-kubernetes\") pod 
\"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.116003 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.115122 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.116593 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.115146 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-var-lib-openvswitch\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.116593 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.115233 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4775e631-4da8-45cc-9fb4-6238451abe84-node-log\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.116593 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.115256 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-kubernetes\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.116593 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.115333 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ed416cd-6dad-4f24-869d-bbc8be60891d-run\") 
pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.116593 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.115610 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4775e631-4da8-45cc-9fb4-6238451abe84-ovnkube-config\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.117407 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.117346 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ed416cd-6dad-4f24-869d-bbc8be60891d-tmp\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.117407 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.117363 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ed416cd-6dad-4f24-869d-bbc8be60891d-etc-tuned\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.119126 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.119105 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4775e631-4da8-45cc-9fb4-6238451abe84-ovn-node-metrics-cert\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.120555 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.119956 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:38.120555 ip-10-0-135-152 kubenswrapper[2565]: 
E0422 15:58:38.119978 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:38.120555 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.119992 2565 projected.go:194] Error preparing data for projected volume kube-api-access-dmzmd for pod openshift-network-diagnostics/network-check-target-57wkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:38.120555 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.120102 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd podName:edee20cb-f531-4653-852f-f16cccf9f024 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:38.620036795 +0000 UTC m=+3.055130469 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dmzmd" (UniqueName: "kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd") pod "network-check-target-57wkv" (UID: "edee20cb-f531-4653-852f-f16cccf9f024") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:38.122770 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.122749 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wtrj\" (UniqueName: \"kubernetes.io/projected/4775e631-4da8-45cc-9fb4-6238451abe84-kube-api-access-7wtrj\") pod \"ovnkube-node-95pm2\" (UID: \"4775e631-4da8-45cc-9fb4-6238451abe84\") " pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:58:38.122867 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.122785 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lckwt\" (UniqueName: \"kubernetes.io/projected/87be9666-efda-4ad0-a50e-fb952cc19860-kube-api-access-lckwt\") pod \"aws-ebs-csi-driver-node-r6zxc\" (UID: \"87be9666-efda-4ad0-a50e-fb952cc19860\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" Apr 22 15:58:38.122957 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.122914 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr9vm\" (UniqueName: \"kubernetes.io/projected/2b5899ec-33ba-45f8-b259-82f0af9723a4-kube-api-access-jr9vm\") pod \"node-ca-9cj4f\" (UID: \"2b5899ec-33ba-45f8-b259-82f0af9723a4\") " pod="openshift-image-registry/node-ca-9cj4f" Apr 22 15:58:38.124504 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.124287 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gwxz\" (UniqueName: \"kubernetes.io/projected/4ed416cd-6dad-4f24-869d-bbc8be60891d-kube-api-access-2gwxz\") pod \"tuned-kksz7\" (UID: \"4ed416cd-6dad-4f24-869d-bbc8be60891d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-kksz7" Apr 22 15:58:38.126261 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.126203 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-152.ec2.internal" event={"ID":"04b47764dfb923467fa0be6032d47f9e","Type":"ContainerStarted","Data":"c7f14f8f70152b2b9bebaae25069711904f6b9e0fb8dc567f410909d1a7fab0b"} Apr 22 15:58:38.127360 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.127319 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pns2p\" (UniqueName: \"kubernetes.io/projected/a8967a65-4b16-4099-9db4-ce8642ba6138-kube-api-access-pns2p\") pod \"node-resolver-k6xqz\" (UID: \"a8967a65-4b16-4099-9db4-ce8642ba6138\") " pod="openshift-dns/node-resolver-k6xqz" Apr 22 15:58:38.127452 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.127388 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" event={"ID":"53bf7234ced363a99f60729950d50036","Type":"ContainerStarted","Data":"f927016926cc3c3bd9a49e62029a3646f011d2d779e9ad9ef1806e1452f15f69"} Apr 22 15:58:38.215437 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215357 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/98755c75-9268-4cfa-8cae-e8ccf20974be-system-cni-dir\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6" Apr 22 15:58:38.215437 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215398 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-system-cni-dir\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm" 
Apr 22 15:58:38.215437 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215433 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-cnibin\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm" Apr 22 15:58:38.215682 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215454 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:38.215682 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215457 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/98755c75-9268-4cfa-8cae-e8ccf20974be-system-cni-dir\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6" Apr 22 15:58:38.215682 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215483 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/98755c75-9268-4cfa-8cae-e8ccf20974be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6" Apr 22 15:58:38.215682 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215511 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d8b2518b-8eb8-4a43-bf50-6f370663ae7c-iptables-alerter-script\") pod \"iptables-alerter-8dxdj\" (UID: 
\"d8b2518b-8eb8-4a43-bf50-6f370663ae7c\") " pod="openshift-network-operator/iptables-alerter-8dxdj"
Apr 22 15:58:38.215682 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215529 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-cnibin\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.215682 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215538 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-multus-conf-dir\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.215682 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215579 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-system-cni-dir\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.215682 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215583 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-multus-conf-dir\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.215682 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215612 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz46x\" (UniqueName: \"kubernetes.io/projected/d9391598-fc74-406d-ad2b-087fbbe59063-kube-api-access-gz46x\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.215682 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215658 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtfmb\" (UniqueName: \"kubernetes.io/projected/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-kube-api-access-rtfmb\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215693 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-var-lib-kubelet\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215717 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9391598-fc74-406d-ad2b-087fbbe59063-multus-daemon-config\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215749 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-os-release\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215768 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/98755c75-9268-4cfa-8cae-e8ccf20974be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215775 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/98755c75-9268-4cfa-8cae-e8ccf20974be-cni-binary-copy\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215806 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-multus-socket-dir-parent\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215831 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-run-k8s-cni-cncf-io\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215855 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-run-multus-certs\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215883 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-multus-cni-dir\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215909 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9391598-fc74-406d-ad2b-087fbbe59063-cni-binary-copy\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215970 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-etc-kubernetes\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.215995 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/98755c75-9268-4cfa-8cae-e8ccf20974be-cnibin\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216021 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/98755c75-9268-4cfa-8cae-e8ccf20974be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216060 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-var-lib-cni-multus\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216105 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hg4m\" (UniqueName: \"kubernetes.io/projected/98755c75-9268-4cfa-8cae-e8ccf20974be-kube-api-access-5hg4m\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216131 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7"
Apr 22 15:58:38.216181 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216156 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/590a23f3-74f3-406f-a0f0-bf1db0f7b0a0-agent-certs\") pod \"konnectivity-agent-rlxkj\" (UID: \"590a23f3-74f3-406f-a0f0-bf1db0f7b0a0\") " pod="kube-system/konnectivity-agent-rlxkj"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216185 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-hostroot\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216209 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d8b2518b-8eb8-4a43-bf50-6f370663ae7c-host-slash\") pod \"iptables-alerter-8dxdj\" (UID: \"d8b2518b-8eb8-4a43-bf50-6f370663ae7c\") " pod="openshift-network-operator/iptables-alerter-8dxdj"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216240 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-var-lib-cni-bin\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216245 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d8b2518b-8eb8-4a43-bf50-6f370663ae7c-iptables-alerter-script\") pod \"iptables-alerter-8dxdj\" (UID: \"d8b2518b-8eb8-4a43-bf50-6f370663ae7c\") " pod="openshift-network-operator/iptables-alerter-8dxdj"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216265 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/590a23f3-74f3-406f-a0f0-bf1db0f7b0a0-konnectivity-ca\") pod \"konnectivity-agent-rlxkj\" (UID: \"590a23f3-74f3-406f-a0f0-bf1db0f7b0a0\") " pod="kube-system/konnectivity-agent-rlxkj"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216310 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8b2db19d-176e-4219-830f-a3b6ed5a34e0-dbus\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216321 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-multus-cni-dir\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216339 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/98755c75-9268-4cfa-8cae-e8ccf20974be-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216367 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fh92\" (UniqueName: \"kubernetes.io/projected/d8b2518b-8eb8-4a43-bf50-6f370663ae7c-kube-api-access-9fh92\") pod \"iptables-alerter-8dxdj\" (UID: \"d8b2518b-8eb8-4a43-bf50-6f370663ae7c\") " pod="openshift-network-operator/iptables-alerter-8dxdj"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216395 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8b2db19d-176e-4219-830f-a3b6ed5a34e0-kubelet-config\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216432 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/98755c75-9268-4cfa-8cae-e8ccf20974be-os-release\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216460 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-run-netns\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.215663 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216537 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-run-netns\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.216592 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret podName:8b2db19d-176e-4219-830f-a3b6ed5a34e0 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:38.716574442 +0000 UTC m=+3.151668119 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret") pod "global-pull-secret-syncer-h264g" (UID: "8b2db19d-176e-4219-830f-a3b6ed5a34e0") : object "kube-system"/"original-pull-secret" not registered
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216711 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-var-lib-kubelet\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.216972 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.216975 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-multus-socket-dir-parent\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217049 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-os-release\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217142 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-etc-kubernetes\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217190 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/98755c75-9268-4cfa-8cae-e8ccf20974be-cnibin\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217203 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9391598-fc74-406d-ad2b-087fbbe59063-multus-daemon-config\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217234 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-var-lib-cni-multus\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.217412 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.217465 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs podName:2cd47a51-d8a9-48f4-bf8e-d11d89cead22 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:38.717454264 +0000 UTC m=+3.152547940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs") pod "network-metrics-daemon-2nbv7" (UID: "2cd47a51-d8a9-48f4-bf8e-d11d89cead22") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217520 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/98755c75-9268-4cfa-8cae-e8ccf20974be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217562 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-run-k8s-cni-cncf-io\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217583 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/98755c75-9268-4cfa-8cae-e8ccf20974be-cni-binary-copy\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217588 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-run-multus-certs\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217679 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8b2db19d-176e-4219-830f-a3b6ed5a34e0-dbus\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217716 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-host-var-lib-cni-bin\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217715 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d8b2518b-8eb8-4a43-bf50-6f370663ae7c-host-slash\") pod \"iptables-alerter-8dxdj\" (UID: \"d8b2518b-8eb8-4a43-bf50-6f370663ae7c\") " pod="openshift-network-operator/iptables-alerter-8dxdj"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217756 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9391598-fc74-406d-ad2b-087fbbe59063-hostroot\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217773 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8b2db19d-176e-4219-830f-a3b6ed5a34e0-kubelet-config\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g"
Apr 22 15:58:38.217932 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.217785 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/98755c75-9268-4cfa-8cae-e8ccf20974be-os-release\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.218641 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.218231 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/98755c75-9268-4cfa-8cae-e8ccf20974be-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.218641 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.218241 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9391598-fc74-406d-ad2b-087fbbe59063-cni-binary-copy\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.218641 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.218256 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/590a23f3-74f3-406f-a0f0-bf1db0f7b0a0-konnectivity-ca\") pod \"konnectivity-agent-rlxkj\" (UID: \"590a23f3-74f3-406f-a0f0-bf1db0f7b0a0\") " pod="kube-system/konnectivity-agent-rlxkj"
Apr 22 15:58:38.220242 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.220222 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/590a23f3-74f3-406f-a0f0-bf1db0f7b0a0-agent-certs\") pod \"konnectivity-agent-rlxkj\" (UID: \"590a23f3-74f3-406f-a0f0-bf1db0f7b0a0\") " pod="kube-system/konnectivity-agent-rlxkj"
Apr 22 15:58:38.227147 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.227127 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtfmb\" (UniqueName: \"kubernetes.io/projected/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-kube-api-access-rtfmb\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7"
Apr 22 15:58:38.227147 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.227136 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz46x\" (UniqueName: \"kubernetes.io/projected/d9391598-fc74-406d-ad2b-087fbbe59063-kube-api-access-gz46x\") pod \"multus-4j4vm\" (UID: \"d9391598-fc74-406d-ad2b-087fbbe59063\") " pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.227401 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.227383 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fh92\" (UniqueName: \"kubernetes.io/projected/d8b2518b-8eb8-4a43-bf50-6f370663ae7c-kube-api-access-9fh92\") pod \"iptables-alerter-8dxdj\" (UID: \"d8b2518b-8eb8-4a43-bf50-6f370663ae7c\") " pod="openshift-network-operator/iptables-alerter-8dxdj"
Apr 22 15:58:38.227851 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.227831 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hg4m\" (UniqueName: \"kubernetes.io/projected/98755c75-9268-4cfa-8cae-e8ccf20974be-kube-api-access-5hg4m\") pod \"multus-additional-cni-plugins-c7hk6\" (UID: \"98755c75-9268-4cfa-8cae-e8ccf20974be\") " pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.301137 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.301071 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:58:38.317304 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.316974 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc"
Apr 22 15:58:38.326782 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.326755 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k6xqz"
Apr 22 15:58:38.342360 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.342339 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rlxkj"
Apr 22 15:58:38.349870 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.349848 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9cj4f"
Apr 22 15:58:38.355473 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.355455 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kksz7"
Apr 22 15:58:38.362044 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.362023 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4j4vm"
Apr 22 15:58:38.369632 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.369612 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c7hk6"
Apr 22 15:58:38.376229 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.376212 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8dxdj"
Apr 22 15:58:38.719920 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.719740 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7"
Apr 22 15:58:38.720129 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.719941 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmzmd\" (UniqueName: \"kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd\") pod \"network-check-target-57wkv\" (UID: \"edee20cb-f531-4653-852f-f16cccf9f024\") " pod="openshift-network-diagnostics/network-check-target-57wkv"
Apr 22 15:58:38.720129 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:38.719972 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g"
Apr 22 15:58:38.720129 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.719885 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:58:38.720129 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.720070 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs podName:2cd47a51-d8a9-48f4-bf8e-d11d89cead22 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:39.720045722 +0000 UTC m=+4.155139408 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs") pod "network-metrics-daemon-2nbv7" (UID: "2cd47a51-d8a9-48f4-bf8e-d11d89cead22") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:58:38.720129 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.720076 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 15:58:38.720129 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.720115 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 15:58:38.720129 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.720132 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 15:58:38.720520 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.720146 2565 projected.go:194] Error preparing data for projected volume kube-api-access-dmzmd for pod openshift-network-diagnostics/network-check-target-57wkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:58:38.720520 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.720146 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret podName:8b2db19d-176e-4219-830f-a3b6ed5a34e0 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:39.720132494 +0000 UTC m=+4.155226185 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret") pod "global-pull-secret-syncer-h264g" (UID: "8b2db19d-176e-4219-830f-a3b6ed5a34e0") : object "kube-system"/"original-pull-secret" not registered
Apr 22 15:58:38.720520 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:38.720196 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd podName:edee20cb-f531-4653-852f-f16cccf9f024 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:39.720188147 +0000 UTC m=+4.155281822 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dmzmd" (UniqueName: "kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd") pod "network-check-target-57wkv" (UID: "edee20cb-f531-4653-852f-f16cccf9f024") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:58:38.866747 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:38.866715 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98755c75_9268_4cfa_8cae_e8ccf20974be.slice/crio-49ae681d8fe668f610dea3f30fe8142873ac9b84e68fbfb04e60c534199ec5d5 WatchSource:0}: Error finding container 49ae681d8fe668f610dea3f30fe8142873ac9b84e68fbfb04e60c534199ec5d5: Status 404 returned error can't find the container with id 49ae681d8fe668f610dea3f30fe8142873ac9b84e68fbfb04e60c534199ec5d5
Apr 22 15:58:38.871124 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:38.871080 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87be9666_efda_4ad0_a50e_fb952cc19860.slice/crio-0cde039c4b1a3dff24d344fbb101e86389795fff2c22f1e6780832e1f24d3694 WatchSource:0}: Error finding container 0cde039c4b1a3dff24d344fbb101e86389795fff2c22f1e6780832e1f24d3694: Status 404 returned error can't find the container with id 0cde039c4b1a3dff24d344fbb101e86389795fff2c22f1e6780832e1f24d3694
Apr 22 15:58:38.871607 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:38.871588 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8967a65_4b16_4099_9db4_ce8642ba6138.slice/crio-c6a041e069634d6de4b984fc1a782b3d6146217538a26487227e8089effe1ab2 WatchSource:0}: Error finding container c6a041e069634d6de4b984fc1a782b3d6146217538a26487227e8089effe1ab2: Status 404 returned error can't find the container with id c6a041e069634d6de4b984fc1a782b3d6146217538a26487227e8089effe1ab2
Apr 22 15:58:38.872993 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:38.872970 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b5899ec_33ba_45f8_b259_82f0af9723a4.slice/crio-d03ade2a1f7f9f3550ed92fb0ac0efb029f8c1a1ba14ca5fd0320900e9051782 WatchSource:0}: Error finding container d03ade2a1f7f9f3550ed92fb0ac0efb029f8c1a1ba14ca5fd0320900e9051782: Status 404 returned error can't find the container with id d03ade2a1f7f9f3550ed92fb0ac0efb029f8c1a1ba14ca5fd0320900e9051782
Apr 22 15:58:38.893376 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:38.893354 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4775e631_4da8_45cc_9fb4_6238451abe84.slice/crio-45f2c87bb83d8de7fa0e814b962aad7acee0dc1e4896c53aae0ce77471d0c390 WatchSource:0}: Error finding container 45f2c87bb83d8de7fa0e814b962aad7acee0dc1e4896c53aae0ce77471d0c390: Status 404 returned error can't find the container with id 45f2c87bb83d8de7fa0e814b962aad7acee0dc1e4896c53aae0ce77471d0c390
Apr 22 15:58:38.894275 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:38.894244 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8b2518b_8eb8_4a43_bf50_6f370663ae7c.slice/crio-511ea2a0931fb850e73e91f37544c0d4135160e4edc9abb1ccdafb1b47949de7 WatchSource:0}: Error finding container 511ea2a0931fb850e73e91f37544c0d4135160e4edc9abb1ccdafb1b47949de7: Status 404 returned error can't find the container with id 511ea2a0931fb850e73e91f37544c0d4135160e4edc9abb1ccdafb1b47949de7
Apr 22 15:58:38.895225 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:58:38.895207 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ed416cd_6dad_4f24_869d_bbc8be60891d.slice/crio-767bce721b5e003aaeb6025b75e86e310a0c50ce452ddf92c83bd586f78cbfc7 WatchSource:0}: Error finding container 767bce721b5e003aaeb6025b75e86e310a0c50ce452ddf92c83bd586f78cbfc7: Status 404 returned error can't find the container with id 767bce721b5e003aaeb6025b75e86e310a0c50ce452ddf92c83bd586f78cbfc7
Apr 22 15:58:39.035277 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.035191 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:53:37 +0000 UTC" deadline="2027-11-26 20:52:08.306861104 +0000 UTC"
Apr 22 15:58:39.035277 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.035217 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13996h53m29.271645932s"
Apr 22 15:58:39.129867 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.129830 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7hk6" event={"ID":"98755c75-9268-4cfa-8cae-e8ccf20974be","Type":"ContainerStarted","Data":"49ae681d8fe668f610dea3f30fe8142873ac9b84e68fbfb04e60c534199ec5d5"}
Apr 22 15:58:39.130785 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.130761 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8dxdj" event={"ID":"d8b2518b-8eb8-4a43-bf50-6f370663ae7c","Type":"ContainerStarted","Data":"511ea2a0931fb850e73e91f37544c0d4135160e4edc9abb1ccdafb1b47949de7"}
Apr 22 15:58:39.131741 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.131718 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4j4vm" event={"ID":"d9391598-fc74-406d-ad2b-087fbbe59063","Type":"ContainerStarted","Data":"54a61793dc89c5a10ac2222cf83dc86979a40a36435b54b1181806450cecbbfe"}
Apr 22 15:58:39.132602 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.132566 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k6xqz" event={"ID":"a8967a65-4b16-4099-9db4-ce8642ba6138","Type":"ContainerStarted","Data":"c6a041e069634d6de4b984fc1a782b3d6146217538a26487227e8089effe1ab2"}
Apr 22 15:58:39.133530 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.133509 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" event={"ID":"87be9666-efda-4ad0-a50e-fb952cc19860","Type":"ContainerStarted","Data":"0cde039c4b1a3dff24d344fbb101e86389795fff2c22f1e6780832e1f24d3694"}
Apr 22 15:58:39.134964 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.134944 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-152.ec2.internal" event={"ID":"04b47764dfb923467fa0be6032d47f9e","Type":"ContainerStarted","Data":"3486622aacdeaebc51f5f338bc91378adb18e5015348345d2f7cb03ebadb8a60"}
Apr 22 15:58:39.136056 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.136030 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kksz7" event={"ID":"4ed416cd-6dad-4f24-869d-bbc8be60891d","Type":"ContainerStarted","Data":"767bce721b5e003aaeb6025b75e86e310a0c50ce452ddf92c83bd586f78cbfc7"}
Apr 22 15:58:39.137058 ip-10-0-135-152
kubenswrapper[2565]: I0422 15:58:39.137038 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" event={"ID":"4775e631-4da8-45cc-9fb4-6238451abe84","Type":"ContainerStarted","Data":"45f2c87bb83d8de7fa0e814b962aad7acee0dc1e4896c53aae0ce77471d0c390"} Apr 22 15:58:39.137813 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.137796 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rlxkj" event={"ID":"590a23f3-74f3-406f-a0f0-bf1db0f7b0a0","Type":"ContainerStarted","Data":"7b9194a25554455a637baaca70e0632ec1d5b83d821340059aca4836a228870f"} Apr 22 15:58:39.138658 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.138639 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9cj4f" event={"ID":"2b5899ec-33ba-45f8-b259-82f0af9723a4","Type":"ContainerStarted","Data":"d03ade2a1f7f9f3550ed92fb0ac0efb029f8c1a1ba14ca5fd0320900e9051782"} Apr 22 15:58:39.148536 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.148491 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-152.ec2.internal" podStartSLOduration=2.148477817 podStartE2EDuration="2.148477817s" podCreationTimestamp="2026-04-22 15:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:58:39.148385714 +0000 UTC m=+3.583479411" watchObservedRunningTime="2026-04-22 15:58:39.148477817 +0000 UTC m=+3.583571519" Apr 22 15:58:39.728767 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.728736 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmzmd\" (UniqueName: \"kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd\") pod \"network-check-target-57wkv\" (UID: \"edee20cb-f531-4653-852f-f16cccf9f024\") " 
pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:39.728878 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.728802 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:39.728878 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:39.728861 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:39.729001 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:39.728968 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:39.729051 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:39.729024 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs podName:2cd47a51-d8a9-48f4-bf8e-d11d89cead22 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:41.729006328 +0000 UTC m=+6.164100009 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs") pod "network-metrics-daemon-2nbv7" (UID: "2cd47a51-d8a9-48f4-bf8e-d11d89cead22") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:39.729428 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:39.729409 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:39.729492 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:39.729434 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:39.729492 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:39.729447 2565 projected.go:194] Error preparing data for projected volume kube-api-access-dmzmd for pod openshift-network-diagnostics/network-check-target-57wkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:39.729492 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:39.729490 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd podName:edee20cb-f531-4653-852f-f16cccf9f024 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:41.729475333 +0000 UTC m=+6.164569023 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dmzmd" (UniqueName: "kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd") pod "network-check-target-57wkv" (UID: "edee20cb-f531-4653-852f-f16cccf9f024") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:39.729669 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:39.729551 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:39.729669 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:39.729584 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret podName:8b2db19d-176e-4219-830f-a3b6ed5a34e0 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:41.729572643 +0000 UTC m=+6.164666319 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret") pod "global-pull-secret-syncer-h264g" (UID: "8b2db19d-176e-4219-830f-a3b6ed5a34e0") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:40.123808 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:40.123714 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:40.124279 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:40.123851 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:58:40.124279 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:40.124267 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:40.124396 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:40.124356 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:58:40.124446 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:40.124434 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:40.124535 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:40.124515 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:58:40.151724 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:40.150824 2565 generic.go:358] "Generic (PLEG): container finished" podID="53bf7234ced363a99f60729950d50036" containerID="19a3f023a8b4b7b2716e1888038e875b18602da438cf339d0b3d1368d3b787d6" exitCode=0 Apr 22 15:58:40.151724 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:40.151680 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" event={"ID":"53bf7234ced363a99f60729950d50036","Type":"ContainerDied","Data":"19a3f023a8b4b7b2716e1888038e875b18602da438cf339d0b3d1368d3b787d6"} Apr 22 15:58:41.184940 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:41.184888 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" event={"ID":"53bf7234ced363a99f60729950d50036","Type":"ContainerStarted","Data":"7e83c09de83556d606d4a26397be463a816b0787ee371585164d6949cd490bf5"} Apr 22 15:58:41.746561 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:41.746523 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:41.746731 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:41.746582 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmzmd\" (UniqueName: \"kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd\") pod \"network-check-target-57wkv\" (UID: \"edee20cb-f531-4653-852f-f16cccf9f024\") " pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:41.746731 
ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:41.746622 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:41.746849 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:41.746739 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:41.746849 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:41.746802 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret podName:8b2db19d-176e-4219-830f-a3b6ed5a34e0 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:45.746784634 +0000 UTC m=+10.181878326 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret") pod "global-pull-secret-syncer-h264g" (UID: "8b2db19d-176e-4219-830f-a3b6ed5a34e0") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:41.747445 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:41.747222 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:41.747445 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:41.747279 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs podName:2cd47a51-d8a9-48f4-bf8e-d11d89cead22 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:45.747262466 +0000 UTC m=+10.182356141 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs") pod "network-metrics-daemon-2nbv7" (UID: "2cd47a51-d8a9-48f4-bf8e-d11d89cead22") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:41.747445 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:41.747356 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:41.747445 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:41.747369 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:41.747445 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:41.747382 2565 projected.go:194] Error preparing data for projected volume kube-api-access-dmzmd for pod openshift-network-diagnostics/network-check-target-57wkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:41.747445 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:41.747416 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd podName:edee20cb-f531-4653-852f-f16cccf9f024 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:45.7474042 +0000 UTC m=+10.182497881 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dmzmd" (UniqueName: "kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd") pod "network-check-target-57wkv" (UID: "edee20cb-f531-4653-852f-f16cccf9f024") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:42.123749 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:42.123243 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:42.123749 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:42.123259 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:42.123749 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:42.123367 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:58:42.123749 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:42.123400 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:42.123749 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:42.123521 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:58:42.123749 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:42.123601 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:58:44.122307 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:44.122221 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:44.122769 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:44.122361 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:58:44.122769 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:44.122658 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:44.122769 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:44.122672 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:44.122930 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:44.122794 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:58:44.122930 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:44.122848 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:58:45.784144 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:45.784107 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:45.784564 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:45.784189 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmzmd\" (UniqueName: \"kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd\") pod \"network-check-target-57wkv\" (UID: \"edee20cb-f531-4653-852f-f16cccf9f024\") " pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:45.784564 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:45.784236 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:45.784564 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:45.784365 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:45.784564 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:45.784421 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret podName:8b2db19d-176e-4219-830f-a3b6ed5a34e0 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:53.784402776 +0000 UTC m=+18.219496458 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret") pod "global-pull-secret-syncer-h264g" (UID: "8b2db19d-176e-4219-830f-a3b6ed5a34e0") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:45.784814 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:45.784797 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:45.784868 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:45.784854 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs podName:2cd47a51-d8a9-48f4-bf8e-d11d89cead22 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:53.784837182 +0000 UTC m=+18.219930883 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs") pod "network-metrics-daemon-2nbv7" (UID: "2cd47a51-d8a9-48f4-bf8e-d11d89cead22") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:45.784945 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:45.784931 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:45.784994 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:45.784951 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:45.784994 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:45.784963 2565 projected.go:194] Error preparing data for projected volume kube-api-access-dmzmd for pod openshift-network-diagnostics/network-check-target-57wkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:45.785060 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:45.784999 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd podName:edee20cb-f531-4653-852f-f16cccf9f024 nodeName:}" failed. No retries permitted until 2026-04-22 15:58:53.784987505 +0000 UTC m=+18.220081184 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dmzmd" (UniqueName: "kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd") pod "network-check-target-57wkv" (UID: "edee20cb-f531-4653-852f-f16cccf9f024") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:46.121979 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:46.121906 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:46.122127 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:46.121987 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:58:46.122316 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:46.122301 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:46.122382 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:46.122367 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:58:46.122546 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:46.122526 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:46.122681 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:46.122623 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:58:48.121980 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:48.121947 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:48.122419 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:48.122069 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:58:48.122419 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:48.121949 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:48.122419 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:48.122194 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:58:48.122419 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:48.121949 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:48.122419 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:48.122275 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:58:50.121726 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:50.121694 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:50.122185 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:50.121704 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:50.122185 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:50.121808 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:58:50.122185 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:50.121930 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:58:50.122185 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:50.121942 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:50.122185 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:50.122064 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:58:52.121281 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:52.121247 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:52.121644 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:52.121247 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:52.121644 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:52.121374 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:58:52.121644 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:52.121259 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:52.121644 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:52.121421 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:58:52.121644 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:52.121510 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:58:53.848591 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:53.848550 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:53.849114 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:53.848640 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:53.849114 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:53.848674 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmzmd\" (UniqueName: \"kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd\") pod \"network-check-target-57wkv\" (UID: \"edee20cb-f531-4653-852f-f16cccf9f024\") " pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:53.849114 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:53.848730 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:53.849114 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:53.848795 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret podName:8b2db19d-176e-4219-830f-a3b6ed5a34e0 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:09.848780818 +0000 UTC m=+34.283874506 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret") pod "global-pull-secret-syncer-h264g" (UID: "8b2db19d-176e-4219-830f-a3b6ed5a34e0") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:58:53.849114 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:53.848806 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:58:53.849114 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:53.848807 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:53.849114 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:53.848824 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:58:53.849114 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:53.848836 2565 projected.go:194] Error preparing data for projected volume kube-api-access-dmzmd for pod openshift-network-diagnostics/network-check-target-57wkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:53.849114 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:53.848873 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs podName:2cd47a51-d8a9-48f4-bf8e-d11d89cead22 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:09.848854925 +0000 UTC m=+34.283948616 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs") pod "network-metrics-daemon-2nbv7" (UID: "2cd47a51-d8a9-48f4-bf8e-d11d89cead22") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:58:53.849114 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:53.848894 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd podName:edee20cb-f531-4653-852f-f16cccf9f024 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:09.84888304 +0000 UTC m=+34.283976732 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dmzmd" (UniqueName: "kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd") pod "network-check-target-57wkv" (UID: "edee20cb-f531-4653-852f-f16cccf9f024") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:58:54.121866 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:54.121784 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:54.121866 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:54.121803 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:54.122201 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:54.121784 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:54.122201 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:54.121915 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:58:54.122201 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:54.121998 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:58:54.122201 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:54.122066 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:58:56.122520 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:56.122494 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:56.122880 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:56.122572 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:58:56.122880 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:56.122659 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:56.122880 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:56.122751 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:58:56.122880 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:56.122789 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:56.122880 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:56.122851 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:58:56.211161 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:56.211128 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7hk6" event={"ID":"98755c75-9268-4cfa-8cae-e8ccf20974be","Type":"ContainerStarted","Data":"c8d57919eea6ebb529309b8453a36c1a740b358853b3f416dcd5200568135a7e"} Apr 22 15:58:56.212533 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:56.212510 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" event={"ID":"87be9666-efda-4ad0-a50e-fb952cc19860","Type":"ContainerStarted","Data":"c411d6eed53b33d6715deb824a589a991891e658350d637c46ebf4ee2c08030b"} Apr 22 15:58:56.234566 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:56.234259 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-152.ec2.internal" podStartSLOduration=19.234239577 podStartE2EDuration="19.234239577s" podCreationTimestamp="2026-04-22 15:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:58:41.201048878 +0000 UTC m=+5.636142576" watchObservedRunningTime="2026-04-22 15:58:56.234239577 +0000 UTC m=+20.669333275" Apr 22 15:58:57.215447 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.215200 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4j4vm" event={"ID":"d9391598-fc74-406d-ad2b-087fbbe59063","Type":"ContainerStarted","Data":"149b87d7fbbc39145622728d8452a84ac07336c8555092854443f7e612257837"} Apr 22 15:58:57.216536 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.216508 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k6xqz" 
event={"ID":"a8967a65-4b16-4099-9db4-ce8642ba6138","Type":"ContainerStarted","Data":"f3804e8b3967ab66a7861ce99a1e84c636a77eab63fa6af96554748595f6b8a3"} Apr 22 15:58:57.217722 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.217695 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kksz7" event={"ID":"4ed416cd-6dad-4f24-869d-bbc8be60891d","Type":"ContainerStarted","Data":"736cd781e4406238bef6629a261488f8255967a2a1983db63448f0650a40d192"} Apr 22 15:58:57.220049 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.220031 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-acl-logging/0.log" Apr 22 15:58:57.220390 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.220370 2565 generic.go:358] "Generic (PLEG): container finished" podID="4775e631-4da8-45cc-9fb4-6238451abe84" containerID="1d31d329eee6feaef6de4adb53658ed2aace985549ca0eb99ad4a1ecce0b0ab1" exitCode=1 Apr 22 15:58:57.220460 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.220434 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" event={"ID":"4775e631-4da8-45cc-9fb4-6238451abe84","Type":"ContainerStarted","Data":"ca9a6381b230b59a8e81c6ebc0c180a3fb7900e28c11ccc7e22b377bcedb0eb4"} Apr 22 15:58:57.220520 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.220459 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" event={"ID":"4775e631-4da8-45cc-9fb4-6238451abe84","Type":"ContainerStarted","Data":"81365caf1fc22fef30eb373bbf49f56d8f9ae5145da37c5a14bc9ebc44a70e28"} Apr 22 15:58:57.220520 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.220468 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" 
event={"ID":"4775e631-4da8-45cc-9fb4-6238451abe84","Type":"ContainerStarted","Data":"a271496c3424194a1b08dacb8441c20aa885dbdff6596caeece6fdae2c3f9801"} Apr 22 15:58:57.220520 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.220476 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" event={"ID":"4775e631-4da8-45cc-9fb4-6238451abe84","Type":"ContainerStarted","Data":"e55ce78c5ad420745a3ee7e19eafeef9c028425c226428cf9405e894f646f727"} Apr 22 15:58:57.220520 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.220484 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" event={"ID":"4775e631-4da8-45cc-9fb4-6238451abe84","Type":"ContainerDied","Data":"1d31d329eee6feaef6de4adb53658ed2aace985549ca0eb99ad4a1ecce0b0ab1"} Apr 22 15:58:57.220520 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.220499 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" event={"ID":"4775e631-4da8-45cc-9fb4-6238451abe84","Type":"ContainerStarted","Data":"84effa191261b619db7f0cd53ec5c3376bb89fe95ea76fa7a899eb860ca4db8a"} Apr 22 15:58:57.221600 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.221582 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rlxkj" event={"ID":"590a23f3-74f3-406f-a0f0-bf1db0f7b0a0","Type":"ContainerStarted","Data":"678576d690304200922f381d7d1b692acd4a27543352636347674be8f785945e"} Apr 22 15:58:57.222730 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.222714 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9cj4f" event={"ID":"2b5899ec-33ba-45f8-b259-82f0af9723a4","Type":"ContainerStarted","Data":"b499a4b7c944d998a9d704c836ab002fa892ec893322acdaa267d54649249dd1"} Apr 22 15:58:57.223967 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.223949 2565 generic.go:358] "Generic (PLEG): container finished" 
podID="98755c75-9268-4cfa-8cae-e8ccf20974be" containerID="c8d57919eea6ebb529309b8453a36c1a740b358853b3f416dcd5200568135a7e" exitCode=0 Apr 22 15:58:57.224036 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.223975 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7hk6" event={"ID":"98755c75-9268-4cfa-8cae-e8ccf20974be","Type":"ContainerDied","Data":"c8d57919eea6ebb529309b8453a36c1a740b358853b3f416dcd5200568135a7e"} Apr 22 15:58:57.230635 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.230604 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4j4vm" podStartSLOduration=4.112707501 podStartE2EDuration="21.230590433s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="2026-04-22 15:58:38.892172598 +0000 UTC m=+3.327266272" lastFinishedPulling="2026-04-22 15:58:56.01005552 +0000 UTC m=+20.445149204" observedRunningTime="2026-04-22 15:58:57.230474327 +0000 UTC m=+21.665568023" watchObservedRunningTime="2026-04-22 15:58:57.230590433 +0000 UTC m=+21.665684128" Apr 22 15:58:57.242922 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.242880 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-k6xqz" podStartSLOduration=4.145986951 podStartE2EDuration="21.242872108s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="2026-04-22 15:58:38.892270691 +0000 UTC m=+3.327364380" lastFinishedPulling="2026-04-22 15:58:55.989155861 +0000 UTC m=+20.424249537" observedRunningTime="2026-04-22 15:58:57.242524442 +0000 UTC m=+21.677618152" watchObservedRunningTime="2026-04-22 15:58:57.242872108 +0000 UTC m=+21.677965804" Apr 22 15:58:57.256912 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.256878 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kksz7" podStartSLOduration=4.169781359 
podStartE2EDuration="21.256870998s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="2026-04-22 15:58:38.89855612 +0000 UTC m=+3.333649798" lastFinishedPulling="2026-04-22 15:58:55.985645747 +0000 UTC m=+20.420739437" observedRunningTime="2026-04-22 15:58:57.256748544 +0000 UTC m=+21.691842301" watchObservedRunningTime="2026-04-22 15:58:57.256870998 +0000 UTC m=+21.691964695" Apr 22 15:58:57.295391 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.295342 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9cj4f" podStartSLOduration=4.21556889 podStartE2EDuration="21.295326611s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="2026-04-22 15:58:38.89231346 +0000 UTC m=+3.327407137" lastFinishedPulling="2026-04-22 15:58:55.972071166 +0000 UTC m=+20.407164858" observedRunningTime="2026-04-22 15:58:57.294670212 +0000 UTC m=+21.729763906" watchObservedRunningTime="2026-04-22 15:58:57.295326611 +0000 UTC m=+21.730420308" Apr 22 15:58:57.420935 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:57.420895 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 15:58:58.067504 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:58.066905 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T15:58:57.420912245Z","UUID":"4d6cdebb-a6ab-4a9f-95a0-7714c4cd67f0","Handler":null,"Name":"","Endpoint":""} Apr 22 15:58:58.068929 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:58.068542 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 15:58:58.068929 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:58.068581 2565 
csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 15:58:58.121299 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:58.121271 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:58:58.121462 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:58.121382 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:58:58.121462 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:58.121271 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:58:58.121547 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:58.121490 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:58:58.121547 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:58.121380 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:58:58.121616 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:58:58.121577 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:58:58.227899 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:58.227859 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" event={"ID":"87be9666-efda-4ad0-a50e-fb952cc19860","Type":"ContainerStarted","Data":"fcb6f40c2e93e331ebbd07cb29c315d4a1d9bc904021d183bbb7be0ccf805c12"} Apr 22 15:58:58.229349 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:58.229319 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8dxdj" event={"ID":"d8b2518b-8eb8-4a43-bf50-6f370663ae7c","Type":"ContainerStarted","Data":"2bbf7b2a5058d24a5ee6df2fb20b23c782be05f99509426a69eb54e854dfcb7b"} Apr 22 15:58:58.242817 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:58.242769 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rlxkj" podStartSLOduration=5.162937116 podStartE2EDuration="22.242751999s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="2026-04-22 15:58:38.89225499 +0000 UTC m=+3.327348674" lastFinishedPulling="2026-04-22 15:58:55.972069872 +0000 UTC m=+20.407163557" observedRunningTime="2026-04-22 15:58:57.311586516 +0000 UTC m=+21.746680214" watchObservedRunningTime="2026-04-22 15:58:58.242751999 +0000 UTC m=+22.677845719" Apr 22 15:58:59.234012 ip-10-0-135-152 kubenswrapper[2565]: I0422 
15:58:59.233927 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" event={"ID":"87be9666-efda-4ad0-a50e-fb952cc19860","Type":"ContainerStarted","Data":"3e5d1bd793f2b66a3f23b52437266a4789547da4e50bd21fce41be34eed0c765"} Apr 22 15:58:59.237849 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:59.237824 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-acl-logging/0.log" Apr 22 15:58:59.238326 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:59.238297 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" event={"ID":"4775e631-4da8-45cc-9fb4-6238451abe84","Type":"ContainerStarted","Data":"dbfccd4e935a49f8a67886ab5eb7bd78f69db49297d8ea69117db0cf3fa18d83"} Apr 22 15:58:59.249169 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:59.249130 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8dxdj" podStartSLOduration=6.163921639 podStartE2EDuration="23.249116418s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="2026-04-22 15:58:38.898283791 +0000 UTC m=+3.333377480" lastFinishedPulling="2026-04-22 15:58:55.983478578 +0000 UTC m=+20.418572259" observedRunningTime="2026-04-22 15:58:58.244024199 +0000 UTC m=+22.679117888" watchObservedRunningTime="2026-04-22 15:58:59.249116418 +0000 UTC m=+23.684210111" Apr 22 15:58:59.249605 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:58:59.249579 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r6zxc" podStartSLOduration=3.263667518 podStartE2EDuration="23.249570377s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="2026-04-22 15:58:38.873917335 +0000 UTC m=+3.309011011" lastFinishedPulling="2026-04-22 15:58:58.859820195 
+0000 UTC m=+23.294913870" observedRunningTime="2026-04-22 15:58:59.249253901 +0000 UTC m=+23.684347580" watchObservedRunningTime="2026-04-22 15:58:59.249570377 +0000 UTC m=+23.684664073" Apr 22 15:59:00.121304 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:00.121275 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:59:00.121485 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:00.121275 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:59:00.121485 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:00.121410 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:59:00.121485 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:00.121285 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:59:00.121659 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:00.121468 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:59:00.121659 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:00.121595 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:59:01.949102 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:01.948931 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rlxkj" Apr 22 15:59:01.949565 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:01.949514 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rlxkj" Apr 22 15:59:02.121739 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:02.121654 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:59:02.121876 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:02.121652 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:59:02.121876 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:02.121803 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:59:02.121979 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:02.121867 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:59:02.121979 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:02.121768 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:59:02.121979 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:02.121938 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:59:02.246703 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:02.246674 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-acl-logging/0.log" Apr 22 15:59:02.247033 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:02.247006 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" event={"ID":"4775e631-4da8-45cc-9fb4-6238451abe84","Type":"ContainerStarted","Data":"d4a564c811b036fde9b7a839355f4ca46d43f29ef2f2d36f083d20390b4885fa"} Apr 22 15:59:02.247324 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:02.247296 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:59:02.247507 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:02.247490 2565 scope.go:117] "RemoveContainer" containerID="1d31d329eee6feaef6de4adb53658ed2aace985549ca0eb99ad4a1ecce0b0ab1" Apr 22 15:59:02.248732 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:02.248711 2565 generic.go:358] "Generic (PLEG): container finished" podID="98755c75-9268-4cfa-8cae-e8ccf20974be" containerID="cdd5cd70bf0d4d1f4ff928f773b3aeb0c7d3720ae92a2d3c1ecbfc4e9d27a5df" exitCode=0 Apr 22 15:59:02.248805 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:02.248767 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7hk6" event={"ID":"98755c75-9268-4cfa-8cae-e8ccf20974be","Type":"ContainerDied","Data":"cdd5cd70bf0d4d1f4ff928f773b3aeb0c7d3720ae92a2d3c1ecbfc4e9d27a5df"} Apr 22 15:59:02.264214 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:02.264197 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:59:03.254104 ip-10-0-135-152 kubenswrapper[2565]: I0422 
15:59:03.254067 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-acl-logging/0.log" Apr 22 15:59:03.254497 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:03.254411 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" event={"ID":"4775e631-4da8-45cc-9fb4-6238451abe84","Type":"ContainerStarted","Data":"03710268165ad8125ea7bf09bb5dd38e8109783e37ac8ef673d89603ca491b9e"} Apr 22 15:59:03.254630 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:03.254612 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:59:03.254689 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:03.254640 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:59:03.256576 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:03.256552 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7hk6" event={"ID":"98755c75-9268-4cfa-8cae-e8ccf20974be","Type":"ContainerStarted","Data":"97ddc34e53118170ea9b43ab53d037144ed404ab472f5a5ca382fb8ea37aa616"} Apr 22 15:59:03.269933 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:03.269913 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" Apr 22 15:59:03.303659 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:03.303616 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2" podStartSLOduration=9.943249882 podStartE2EDuration="27.303602231s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="2026-04-22 15:58:38.898700889 +0000 UTC m=+3.333794567" lastFinishedPulling="2026-04-22 15:58:56.25905324 +0000 UTC m=+20.694146916" 
observedRunningTime="2026-04-22 15:59:03.302298961 +0000 UTC m=+27.737392669" watchObservedRunningTime="2026-04-22 15:59:03.303602231 +0000 UTC m=+27.738695928" Apr 22 15:59:03.588279 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:03.588053 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h264g"] Apr 22 15:59:03.588427 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:03.588330 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:59:03.588427 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:03.588417 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:59:03.591082 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:03.591058 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-57wkv"] Apr 22 15:59:03.591216 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:03.591172 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:59:03.591270 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:03.591249 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:59:03.591712 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:03.591686 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2nbv7"] Apr 22 15:59:03.591827 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:03.591811 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:59:03.591959 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:03.591941 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:59:04.260465 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:04.260431 2565 generic.go:358] "Generic (PLEG): container finished" podID="98755c75-9268-4cfa-8cae-e8ccf20974be" containerID="97ddc34e53118170ea9b43ab53d037144ed404ab472f5a5ca382fb8ea37aa616" exitCode=0 Apr 22 15:59:04.260917 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:04.260499 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7hk6" event={"ID":"98755c75-9268-4cfa-8cae-e8ccf20974be","Type":"ContainerDied","Data":"97ddc34e53118170ea9b43ab53d037144ed404ab472f5a5ca382fb8ea37aa616"} Apr 22 15:59:05.120960 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:05.120934 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:59:05.121075 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:05.120942 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:59:05.121218 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:05.121065 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:59:05.121218 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:05.121153 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:59:05.121314 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:05.120943 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:59:05.121363 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:05.121308 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:59:05.264949 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:05.264918 2565 generic.go:358] "Generic (PLEG): container finished" podID="98755c75-9268-4cfa-8cae-e8ccf20974be" containerID="7e2dea226445320e20df9b583ed13c6a69a8651b5d0bd2a4e6173c6cce8fd731" exitCode=0 Apr 22 15:59:05.265330 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:05.264992 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7hk6" event={"ID":"98755c75-9268-4cfa-8cae-e8ccf20974be","Type":"ContainerDied","Data":"7e2dea226445320e20df9b583ed13c6a69a8651b5d0bd2a4e6173c6cce8fd731"} Apr 22 15:59:05.965053 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:05.964965 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rlxkj" Apr 22 15:59:05.965225 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:05.965154 2565 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 15:59:05.965684 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:05.965658 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rlxkj" Apr 22 15:59:07.121796 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:07.121763 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:59:07.121796 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:07.121794 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:59:07.122510 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:07.121802 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:59:07.122510 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:07.121877 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-57wkv" podUID="edee20cb-f531-4653-852f-f16cccf9f024" Apr 22 15:59:07.122510 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:07.121950 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h264g" podUID="8b2db19d-176e-4219-830f-a3b6ed5a34e0" Apr 22 15:59:07.122510 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:07.122039 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2nbv7" podUID="2cd47a51-d8a9-48f4-bf8e-d11d89cead22" Apr 22 15:59:08.934320 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.934247 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-152.ec2.internal" event="NodeReady" Apr 22 15:59:08.934786 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.934411 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 15:59:08.967051 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.967017 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-77995db544-fmqlq"] Apr 22 15:59:08.971547 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.971524 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:08.974202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.974175 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 15:59:08.974202 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.974187 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 15:59:08.974395 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.974213 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 15:59:08.974395 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.974187 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-4mqmh\"" Apr 22 15:59:08.979070 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.978897 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 
15:59:08.980016 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.979993 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77995db544-fmqlq"] Apr 22 15:59:08.983410 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.983381 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hx8z7"] Apr 22 15:59:08.987549 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.987530 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6s4ph"] Apr 22 15:59:08.987709 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.987691 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:08.989718 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.989699 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 15:59:08.989818 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.989702 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 15:59:08.989971 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.989952 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lm8hj\"" Apr 22 15:59:08.990789 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.990772 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6s4ph" Apr 22 15:59:08.992000 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.991980 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hx8z7"] Apr 22 15:59:08.992781 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.992619 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 15:59:08.992781 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.992679 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbbp8\"" Apr 22 15:59:08.992940 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.992924 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 15:59:08.993001 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.992955 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 15:59:08.996044 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:08.996023 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6s4ph"] Apr 22 15:59:09.065902 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.065869 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj7rr\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-kube-api-access-fj7rr\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.066050 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.065912 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkz7s\" (UniqueName: 
\"kubernetes.io/projected/83a80857-8a6e-454e-be2b-ecc561993b6d-kube-api-access-fkz7s\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:09.066050 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.065970 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b7616a8-fa8e-46e3-ac90-b510f706491e-trusted-ca\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.066196 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.066074 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-certificates\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.066196 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.066165 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b7616a8-fa8e-46e3-ac90-b510f706491e-image-registry-private-configuration\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.066297 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.066226 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-bound-sa-token\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " 
pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.066297 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.066251 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83a80857-8a6e-454e-be2b-ecc561993b6d-config-volume\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:09.066297 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.066277 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:09.066424 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.066350 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.066424 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.066395 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83a80857-8a6e-454e-be2b-ecc561993b6d-tmp-dir\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:09.066516 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.066475 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/1b7616a8-fa8e-46e3-ac90-b510f706491e-ca-trust-extracted\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.066573 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.066528 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b7616a8-fa8e-46e3-ac90-b510f706491e-installation-pull-secrets\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.121281 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.121237 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:59:09.121447 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.121237 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:59:09.121447 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.121237 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:59:09.123892 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.123861 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-52jwv\"" Apr 22 15:59:09.123892 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.123861 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 15:59:09.124081 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.123944 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jgt28\"" Apr 22 15:59:09.124158 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.124120 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 15:59:09.124210 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.124193 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 15:59:09.124335 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.124317 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 15:59:09.167904 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.167877 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b7616a8-fa8e-46e3-ac90-b510f706491e-ca-trust-extracted\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.168052 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.167910 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b7616a8-fa8e-46e3-ac90-b510f706491e-installation-pull-secrets\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.168052 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.167952 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq8s4\" (UniqueName: \"kubernetes.io/projected/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-kube-api-access-mq8s4\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph" Apr 22 15:59:09.168052 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.167990 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fj7rr\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-kube-api-access-fj7rr\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.168052 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.168017 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkz7s\" (UniqueName: \"kubernetes.io/projected/83a80857-8a6e-454e-be2b-ecc561993b6d-kube-api-access-fkz7s\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:09.168253 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.168218 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b7616a8-fa8e-46e3-ac90-b510f706491e-trusted-ca\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 
15:59:09.168306 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.168266 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-certificates\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.168306 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.168296 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph" Apr 22 15:59:09.168378 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.168317 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b7616a8-fa8e-46e3-ac90-b510f706491e-ca-trust-extracted\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.168378 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.168331 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b7616a8-fa8e-46e3-ac90-b510f706491e-image-registry-private-configuration\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.168635 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.168611 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-bound-sa-token\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.168758 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.168652 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83a80857-8a6e-454e-be2b-ecc561993b6d-config-volume\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:09.168758 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.168685 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:09.168758 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.168723 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.168758 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.168745 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83a80857-8a6e-454e-be2b-ecc561993b6d-tmp-dir\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:09.168951 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.168931 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-certificates\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.169069 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.169052 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83a80857-8a6e-454e-be2b-ecc561993b6d-tmp-dir\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:09.169146 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.169122 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b7616a8-fa8e-46e3-ac90-b510f706491e-trusted-ca\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.169194 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.169159 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:09.169239 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.169210 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:59:09.169239 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.169220 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls podName:83a80857-8a6e-454e-be2b-ecc561993b6d nodeName:}" failed. No retries permitted until 2026-04-22 15:59:09.669201983 +0000 UTC m=+34.104295658 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls") pod "dns-default-hx8z7" (UID: "83a80857-8a6e-454e-be2b-ecc561993b6d") : secret "dns-default-metrics-tls" not found Apr 22 15:59:09.169239 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.169224 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77995db544-fmqlq: secret "image-registry-tls" not found Apr 22 15:59:09.169340 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.169268 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls podName:1b7616a8-fa8e-46e3-ac90-b510f706491e nodeName:}" failed. No retries permitted until 2026-04-22 15:59:09.669252719 +0000 UTC m=+34.104346412 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls") pod "image-registry-77995db544-fmqlq" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e") : secret "image-registry-tls" not found Apr 22 15:59:09.169624 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.169605 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83a80857-8a6e-454e-be2b-ecc561993b6d-config-volume\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:09.172721 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.172596 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b7616a8-fa8e-46e3-ac90-b510f706491e-installation-pull-secrets\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " 
pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.172822 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.172596 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b7616a8-fa8e-46e3-ac90-b510f706491e-image-registry-private-configuration\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.177441 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.177420 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-bound-sa-token\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.177569 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.177549 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkz7s\" (UniqueName: \"kubernetes.io/projected/83a80857-8a6e-454e-be2b-ecc561993b6d-kube-api-access-fkz7s\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:09.177686 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.177666 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj7rr\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-kube-api-access-fj7rr\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.269399 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.269365 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mq8s4\" (UniqueName: 
\"kubernetes.io/projected/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-kube-api-access-mq8s4\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph" Apr 22 15:59:09.269565 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.269420 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph" Apr 22 15:59:09.269625 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.269565 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:09.269670 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.269638 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert podName:1480b1d8-29b2-4b9f-9d5a-04a492daadb0 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:09.769617657 +0000 UTC m=+34.204711342 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert") pod "ingress-canary-6s4ph" (UID: "1480b1d8-29b2-4b9f-9d5a-04a492daadb0") : secret "canary-serving-cert" not found Apr 22 15:59:09.277868 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.277847 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq8s4\" (UniqueName: \"kubernetes.io/projected/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-kube-api-access-mq8s4\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph" Apr 22 15:59:09.672292 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.672208 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:09.672292 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.672247 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:09.672510 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.672366 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:59:09.672510 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.672373 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:09.672510 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.672450 2565 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls podName:83a80857-8a6e-454e-be2b-ecc561993b6d nodeName:}" failed. No retries permitted until 2026-04-22 15:59:10.672430464 +0000 UTC m=+35.107524143 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls") pod "dns-default-hx8z7" (UID: "83a80857-8a6e-454e-be2b-ecc561993b6d") : secret "dns-default-metrics-tls" not found Apr 22 15:59:09.672510 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.672379 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77995db544-fmqlq: secret "image-registry-tls" not found Apr 22 15:59:09.672706 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.672522 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls podName:1b7616a8-fa8e-46e3-ac90-b510f706491e nodeName:}" failed. No retries permitted until 2026-04-22 15:59:10.672505428 +0000 UTC m=+35.107599116 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls") pod "image-registry-77995db544-fmqlq" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e") : secret "image-registry-tls" not found Apr 22 15:59:09.773475 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.773442 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph" Apr 22 15:59:09.773797 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.773770 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:09.773904 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.773844 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert podName:1480b1d8-29b2-4b9f-9d5a-04a492daadb0 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:10.773825377 +0000 UTC m=+35.208919053 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert") pod "ingress-canary-6s4ph" (UID: "1480b1d8-29b2-4b9f-9d5a-04a492daadb0") : secret "canary-serving-cert" not found Apr 22 15:59:09.874321 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.874278 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 15:59:09.874476 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.874331 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmzmd\" (UniqueName: \"kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd\") pod \"network-check-target-57wkv\" (UID: \"edee20cb-f531-4653-852f-f16cccf9f024\") " pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:59:09.874476 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.874381 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:59:09.874476 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.874460 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 15:59:09.874618 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:09.874553 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs podName:2cd47a51-d8a9-48f4-bf8e-d11d89cead22 nodeName:}" 
failed. No retries permitted until 2026-04-22 15:59:41.874532555 +0000 UTC m=+66.309626235 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs") pod "network-metrics-daemon-2nbv7" (UID: "2cd47a51-d8a9-48f4-bf8e-d11d89cead22") : secret "metrics-daemon-secret" not found Apr 22 15:59:09.877547 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.877519 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8b2db19d-176e-4219-830f-a3b6ed5a34e0-original-pull-secret\") pod \"global-pull-secret-syncer-h264g\" (UID: \"8b2db19d-176e-4219-830f-a3b6ed5a34e0\") " pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:59:09.877719 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:09.877700 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmzmd\" (UniqueName: \"kubernetes.io/projected/edee20cb-f531-4653-852f-f16cccf9f024-kube-api-access-dmzmd\") pod \"network-check-target-57wkv\" (UID: \"edee20cb-f531-4653-852f-f16cccf9f024\") " pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:59:10.032528 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:10.032491 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-57wkv" Apr 22 15:59:10.040359 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:10.040336 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h264g" Apr 22 15:59:10.680937 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:10.680905 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:10.681134 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:10.681009 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:10.681134 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:10.681069 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:59:10.681134 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:10.681106 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77995db544-fmqlq: secret "image-registry-tls" not found Apr 22 15:59:10.681273 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:10.681137 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:10.681273 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:10.681181 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls podName:1b7616a8-fa8e-46e3-ac90-b510f706491e nodeName:}" failed. No retries permitted until 2026-04-22 15:59:12.68116094 +0000 UTC m=+37.116254635 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls") pod "image-registry-77995db544-fmqlq" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e") : secret "image-registry-tls" not found Apr 22 15:59:10.681273 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:10.681201 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls podName:83a80857-8a6e-454e-be2b-ecc561993b6d nodeName:}" failed. No retries permitted until 2026-04-22 15:59:12.681191113 +0000 UTC m=+37.116284794 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls") pod "dns-default-hx8z7" (UID: "83a80857-8a6e-454e-be2b-ecc561993b6d") : secret "dns-default-metrics-tls" not found Apr 22 15:59:10.783534 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:10.782074 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph" Apr 22 15:59:10.783534 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:10.782255 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:10.783534 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:10.782313 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert podName:1480b1d8-29b2-4b9f-9d5a-04a492daadb0 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:12.782296532 +0000 UTC m=+37.217390209 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert") pod "ingress-canary-6s4ph" (UID: "1480b1d8-29b2-4b9f-9d5a-04a492daadb0") : secret "canary-serving-cert" not found Apr 22 15:59:11.014219 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:11.014188 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-57wkv"] Apr 22 15:59:11.017558 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:11.017535 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h264g"] Apr 22 15:59:11.064789 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:59:11.064760 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedee20cb_f531_4653_852f_f16cccf9f024.slice/crio-b1f80d2b2f781edcc6a54b688c4210f819e78c9fd089b7abacb236b529458cc4 WatchSource:0}: Error finding container b1f80d2b2f781edcc6a54b688c4210f819e78c9fd089b7abacb236b529458cc4: Status 404 returned error can't find the container with id b1f80d2b2f781edcc6a54b688c4210f819e78c9fd089b7abacb236b529458cc4 Apr 22 15:59:11.065137 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:59:11.065031 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b2db19d_176e_4219_830f_a3b6ed5a34e0.slice/crio-87309433e5af18032528060a0d03df2df0497cc5523e7bec3dbe533c40687af7 WatchSource:0}: Error finding container 87309433e5af18032528060a0d03df2df0497cc5523e7bec3dbe533c40687af7: Status 404 returned error can't find the container with id 87309433e5af18032528060a0d03df2df0497cc5523e7bec3dbe533c40687af7 Apr 22 15:59:11.277534 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:11.277501 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-57wkv" 
event={"ID":"edee20cb-f531-4653-852f-f16cccf9f024","Type":"ContainerStarted","Data":"b1f80d2b2f781edcc6a54b688c4210f819e78c9fd089b7abacb236b529458cc4"} Apr 22 15:59:11.280210 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:11.280176 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7hk6" event={"ID":"98755c75-9268-4cfa-8cae-e8ccf20974be","Type":"ContainerStarted","Data":"677aab521b33c8f4d73c4e396a599bf75391ca55eace68ebe7329ee11bb9a094"} Apr 22 15:59:11.281280 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:11.281258 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h264g" event={"ID":"8b2db19d-176e-4219-830f-a3b6ed5a34e0","Type":"ContainerStarted","Data":"87309433e5af18032528060a0d03df2df0497cc5523e7bec3dbe533c40687af7"} Apr 22 15:59:12.286607 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:12.286570 2565 generic.go:358] "Generic (PLEG): container finished" podID="98755c75-9268-4cfa-8cae-e8ccf20974be" containerID="677aab521b33c8f4d73c4e396a599bf75391ca55eace68ebe7329ee11bb9a094" exitCode=0 Apr 22 15:59:12.287052 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:12.286632 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7hk6" event={"ID":"98755c75-9268-4cfa-8cae-e8ccf20974be","Type":"ContainerDied","Data":"677aab521b33c8f4d73c4e396a599bf75391ca55eace68ebe7329ee11bb9a094"} Apr 22 15:59:12.698472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:12.698064 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:12.698472 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:12.698349 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:12.698684 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:12.698214 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:12.698684 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:12.698573 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls podName:83a80857-8a6e-454e-be2b-ecc561993b6d nodeName:}" failed. No retries permitted until 2026-04-22 15:59:16.698554923 +0000 UTC m=+41.133648604 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls") pod "dns-default-hx8z7" (UID: "83a80857-8a6e-454e-be2b-ecc561993b6d") : secret "dns-default-metrics-tls" not found Apr 22 15:59:12.698938 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:12.698510 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:59:12.698938 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:12.698828 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77995db544-fmqlq: secret "image-registry-tls" not found Apr 22 15:59:12.698938 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:12.698907 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls podName:1b7616a8-fa8e-46e3-ac90-b510f706491e nodeName:}" failed. No retries permitted until 2026-04-22 15:59:16.698891831 +0000 UTC m=+41.133985507 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls") pod "image-registry-77995db544-fmqlq" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e") : secret "image-registry-tls" not found Apr 22 15:59:12.799681 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:12.799192 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph" Apr 22 15:59:12.799681 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:12.799334 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:12.799681 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:12.799397 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert podName:1480b1d8-29b2-4b9f-9d5a-04a492daadb0 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:16.799380604 +0000 UTC m=+41.234474285 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert") pod "ingress-canary-6s4ph" (UID: "1480b1d8-29b2-4b9f-9d5a-04a492daadb0") : secret "canary-serving-cert" not found Apr 22 15:59:13.296506 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:13.296456 2565 generic.go:358] "Generic (PLEG): container finished" podID="98755c75-9268-4cfa-8cae-e8ccf20974be" containerID="fcc658352a22ef2404be49f667a30d5ffb525eb99b1135e0319c3c9446583d4e" exitCode=0 Apr 22 15:59:13.296908 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:13.296524 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7hk6" event={"ID":"98755c75-9268-4cfa-8cae-e8ccf20974be","Type":"ContainerDied","Data":"fcc658352a22ef2404be49f667a30d5ffb525eb99b1135e0319c3c9446583d4e"} Apr 22 15:59:14.301872 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:14.301835 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7hk6" event={"ID":"98755c75-9268-4cfa-8cae-e8ccf20974be","Type":"ContainerStarted","Data":"9a4bc93b490cf88b2483309b3dfa6b95a08fd894f497d5e9a34b8037fb62fc05"} Apr 22 15:59:16.143999 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:16.143931 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-c7hk6" podStartSLOduration=7.9155196960000005 podStartE2EDuration="40.143913714s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="2026-04-22 15:58:38.869197005 +0000 UTC m=+3.304290695" lastFinishedPulling="2026-04-22 15:59:11.097591038 +0000 UTC m=+35.532684713" observedRunningTime="2026-04-22 15:59:14.323570348 +0000 UTC m=+38.758664080" watchObservedRunningTime="2026-04-22 15:59:16.143913714 +0000 UTC m=+40.579007413" Apr 22 15:59:16.319791 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:16.319663 2565 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-diagnostics/network-check-target-57wkv" event={"ID":"edee20cb-f531-4653-852f-f16cccf9f024","Type":"ContainerStarted","Data":"df8dd08e0d102dae06fba3bfdbc80331721eadf8381f4d44dc36977f38415528"} Apr 22 15:59:16.727299 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:16.727258 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7" Apr 22 15:59:16.727299 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:16.727300 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 15:59:16.727471 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:16.727407 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:59:16.727471 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:16.727428 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:59:16.727471 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:16.727440 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77995db544-fmqlq: secret "image-registry-tls" not found Apr 22 15:59:16.727579 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:16.727474 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls podName:83a80857-8a6e-454e-be2b-ecc561993b6d nodeName:}" failed. 
No retries permitted until 2026-04-22 15:59:24.727458081 +0000 UTC m=+49.162551755 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls") pod "dns-default-hx8z7" (UID: "83a80857-8a6e-454e-be2b-ecc561993b6d") : secret "dns-default-metrics-tls" not found Apr 22 15:59:16.727579 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:16.727488 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls podName:1b7616a8-fa8e-46e3-ac90-b510f706491e nodeName:}" failed. No retries permitted until 2026-04-22 15:59:24.727480943 +0000 UTC m=+49.162574618 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls") pod "image-registry-77995db544-fmqlq" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e") : secret "image-registry-tls" not found Apr 22 15:59:16.827996 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:16.827898 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph" Apr 22 15:59:16.828159 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:16.828048 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:59:16.828159 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:16.828126 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert podName:1480b1d8-29b2-4b9f-9d5a-04a492daadb0 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:59:24.828112055 +0000 UTC m=+49.263205731 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert") pod "ingress-canary-6s4ph" (UID: "1480b1d8-29b2-4b9f-9d5a-04a492daadb0") : secret "canary-serving-cert" not found
Apr 22 15:59:17.322666 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:17.322627 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h264g" event={"ID":"8b2db19d-176e-4219-830f-a3b6ed5a34e0","Type":"ContainerStarted","Data":"66c90bfd71717dfd88de111bf3674f36c6449d16c68386671a84f2f5d5c6a49a"}
Apr 22 15:59:17.336848 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:17.336802 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-h264g" podStartSLOduration=36.227166057 podStartE2EDuration="41.336790067s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="2026-04-22 15:59:11.075307903 +0000 UTC m=+35.510401579" lastFinishedPulling="2026-04-22 15:59:16.184931914 +0000 UTC m=+40.620025589" observedRunningTime="2026-04-22 15:59:17.335845639 +0000 UTC m=+41.770939336" watchObservedRunningTime="2026-04-22 15:59:17.336790067 +0000 UTC m=+41.771883763"
Apr 22 15:59:17.349934 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:17.349885 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-57wkv" podStartSLOduration=36.250569178 podStartE2EDuration="41.349872615s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="2026-04-22 15:59:11.075252758 +0000 UTC m=+35.510346433" lastFinishedPulling="2026-04-22 15:59:16.174556196 +0000 UTC m=+40.609649870" observedRunningTime="2026-04-22 15:59:17.349202359 +0000 UTC m=+41.784296057" watchObservedRunningTime="2026-04-22 15:59:17.349872615 +0000 UTC m=+41.784966311"
Apr 22 15:59:24.782551 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:24.782515 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7"
Apr 22 15:59:24.782551 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:24.782559 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq"
Apr 22 15:59:24.783133 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:24.782663 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 15:59:24.783133 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:24.782665 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:59:24.783133 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:24.782735 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls podName:83a80857-8a6e-454e-be2b-ecc561993b6d nodeName:}" failed. No retries permitted until 2026-04-22 15:59:40.782720579 +0000 UTC m=+65.217814253 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls") pod "dns-default-hx8z7" (UID: "83a80857-8a6e-454e-be2b-ecc561993b6d") : secret "dns-default-metrics-tls" not found
Apr 22 15:59:24.783133 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:24.782675 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77995db544-fmqlq: secret "image-registry-tls" not found
Apr 22 15:59:24.783133 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:24.782784 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls podName:1b7616a8-fa8e-46e3-ac90-b510f706491e nodeName:}" failed. No retries permitted until 2026-04-22 15:59:40.782773011 +0000 UTC m=+65.217866686 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls") pod "image-registry-77995db544-fmqlq" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e") : secret "image-registry-tls" not found
Apr 22 15:59:24.883862 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:24.883830 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph"
Apr 22 15:59:24.884004 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:24.883946 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:59:24.884004 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:24.883998 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert podName:1480b1d8-29b2-4b9f-9d5a-04a492daadb0 nodeName:}" failed. No retries permitted until 2026-04-22 15:59:40.883984846 +0000 UTC m=+65.319078525 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert") pod "ingress-canary-6s4ph" (UID: "1480b1d8-29b2-4b9f-9d5a-04a492daadb0") : secret "canary-serving-cert" not found
Apr 22 15:59:27.323499 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:27.323463 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-57wkv"
Apr 22 15:59:35.275858 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:35.275822 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-95pm2"
Apr 22 15:59:40.784140 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:40.784102 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7"
Apr 22 15:59:40.784140 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:40.784146 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq"
Apr 22 15:59:40.784580 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:40.784218 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:59:40.784580 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:40.784236 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 15:59:40.784580 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:40.784246 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77995db544-fmqlq: secret "image-registry-tls" not found
Apr 22 15:59:40.784580 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:40.784284 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls podName:83a80857-8a6e-454e-be2b-ecc561993b6d nodeName:}" failed. No retries permitted until 2026-04-22 16:00:12.784269145 +0000 UTC m=+97.219362820 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls") pod "dns-default-hx8z7" (UID: "83a80857-8a6e-454e-be2b-ecc561993b6d") : secret "dns-default-metrics-tls" not found
Apr 22 15:59:40.784580 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:40.784298 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls podName:1b7616a8-fa8e-46e3-ac90-b510f706491e nodeName:}" failed. No retries permitted until 2026-04-22 16:00:12.784291926 +0000 UTC m=+97.219385602 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls") pod "image-registry-77995db544-fmqlq" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e") : secret "image-registry-tls" not found
Apr 22 15:59:40.884741 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:40.884714 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph"
Apr 22 15:59:40.884870 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:40.884846 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:59:40.884916 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:40.884902 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert podName:1480b1d8-29b2-4b9f-9d5a-04a492daadb0 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:12.884889674 +0000 UTC m=+97.319983350 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert") pod "ingress-canary-6s4ph" (UID: "1480b1d8-29b2-4b9f-9d5a-04a492daadb0") : secret "canary-serving-cert" not found
Apr 22 15:59:41.891793 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:41.891758 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7"
Apr 22 15:59:41.892198 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:41.891882 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 15:59:41.892198 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:41.891945 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs podName:2cd47a51-d8a9-48f4-bf8e-d11d89cead22 nodeName:}" failed. No retries permitted until 2026-04-22 16:00:45.891931474 +0000 UTC m=+130.327025149 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs") pod "network-metrics-daemon-2nbv7" (UID: "2cd47a51-d8a9-48f4-bf8e-d11d89cead22") : secret "metrics-daemon-secret" not found
Apr 22 15:59:47.326045 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:47.326010 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-57wkv"
Apr 22 15:59:52.521353 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.521314 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-nrtqr"]
Apr 22 15:59:52.524571 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.524555 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.526761 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.526739 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 15:59:52.527002 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.526976 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:59:52.527682 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.527665 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 15:59:52.527800 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.527684 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-6nv9h\""
Apr 22 15:59:52.527800 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.527673 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 15:59:52.532513 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.532495 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 22 15:59:52.533887 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.533868 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-nrtqr"]
Apr 22 15:59:52.567924 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.567902 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72f4e084-b355-494f-955a-9d9d02e32cdb-config\") pod \"console-operator-9d4b6777b-nrtqr\" (UID: \"72f4e084-b355-494f-955a-9d9d02e32cdb\") " pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.568005 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.567937 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f4e084-b355-494f-955a-9d9d02e32cdb-serving-cert\") pod \"console-operator-9d4b6777b-nrtqr\" (UID: \"72f4e084-b355-494f-955a-9d9d02e32cdb\") " pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.568005 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.567971 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72f4e084-b355-494f-955a-9d9d02e32cdb-trusted-ca\") pod \"console-operator-9d4b6777b-nrtqr\" (UID: \"72f4e084-b355-494f-955a-9d9d02e32cdb\") " pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.568005 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.567988 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjqcp\" (UniqueName: \"kubernetes.io/projected/72f4e084-b355-494f-955a-9d9d02e32cdb-kube-api-access-pjqcp\") pod \"console-operator-9d4b6777b-nrtqr\" (UID: \"72f4e084-b355-494f-955a-9d9d02e32cdb\") " pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.669238 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.669140 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72f4e084-b355-494f-955a-9d9d02e32cdb-config\") pod \"console-operator-9d4b6777b-nrtqr\" (UID: \"72f4e084-b355-494f-955a-9d9d02e32cdb\") " pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.669238 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.669179 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f4e084-b355-494f-955a-9d9d02e32cdb-serving-cert\") pod \"console-operator-9d4b6777b-nrtqr\" (UID: \"72f4e084-b355-494f-955a-9d9d02e32cdb\") " pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.669404 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.669257 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72f4e084-b355-494f-955a-9d9d02e32cdb-trusted-ca\") pod \"console-operator-9d4b6777b-nrtqr\" (UID: \"72f4e084-b355-494f-955a-9d9d02e32cdb\") " pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.669404 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.669286 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjqcp\" (UniqueName: \"kubernetes.io/projected/72f4e084-b355-494f-955a-9d9d02e32cdb-kube-api-access-pjqcp\") pod \"console-operator-9d4b6777b-nrtqr\" (UID: \"72f4e084-b355-494f-955a-9d9d02e32cdb\") " pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.669807 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.669789 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72f4e084-b355-494f-955a-9d9d02e32cdb-config\") pod \"console-operator-9d4b6777b-nrtqr\" (UID: \"72f4e084-b355-494f-955a-9d9d02e32cdb\") " pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.670028 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.670009 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72f4e084-b355-494f-955a-9d9d02e32cdb-trusted-ca\") pod \"console-operator-9d4b6777b-nrtqr\" (UID: \"72f4e084-b355-494f-955a-9d9d02e32cdb\") " pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.673573 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.673554 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f4e084-b355-494f-955a-9d9d02e32cdb-serving-cert\") pod \"console-operator-9d4b6777b-nrtqr\" (UID: \"72f4e084-b355-494f-955a-9d9d02e32cdb\") " pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.677203 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.677181 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjqcp\" (UniqueName: \"kubernetes.io/projected/72f4e084-b355-494f-955a-9d9d02e32cdb-kube-api-access-pjqcp\") pod \"console-operator-9d4b6777b-nrtqr\" (UID: \"72f4e084-b355-494f-955a-9d9d02e32cdb\") " pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.833853 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.833775 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 15:59:52.945500 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:52.945470 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-nrtqr"]
Apr 22 15:59:52.948321 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:59:52.948293 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72f4e084_b355_494f_955a_9d9d02e32cdb.slice/crio-155ed4a1ff23d90468d69170d2626b8bbdcb7490f473fa174f18e64df21df0b9 WatchSource:0}: Error finding container 155ed4a1ff23d90468d69170d2626b8bbdcb7490f473fa174f18e64df21df0b9: Status 404 returned error can't find the container with id 155ed4a1ff23d90468d69170d2626b8bbdcb7490f473fa174f18e64df21df0b9
Apr 22 15:59:53.391563 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:53.391527 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" event={"ID":"72f4e084-b355-494f-955a-9d9d02e32cdb","Type":"ContainerStarted","Data":"155ed4a1ff23d90468d69170d2626b8bbdcb7490f473fa174f18e64df21df0b9"}
Apr 22 15:59:54.017267 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:54.017235 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-k6xqz_a8967a65-4b16-4099-9db4-ce8642ba6138/dns-node-resolver/0.log"
Apr 22 15:59:54.816515 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:54.816485 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9cj4f_2b5899ec-33ba-45f8-b259-82f0af9723a4/node-ca/0.log"
Apr 22 15:59:55.397134 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:55.397044 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/0.log"
Apr 22 15:59:55.397134 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:55.397110 2565 generic.go:358] "Generic (PLEG): container finished" podID="72f4e084-b355-494f-955a-9d9d02e32cdb" containerID="564de8ed4aaedc92c4b01321bf14034449454a201e4cab29870c84ac36ba5ec9" exitCode=255
Apr 22 15:59:55.397504 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:55.397149 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" event={"ID":"72f4e084-b355-494f-955a-9d9d02e32cdb","Type":"ContainerDied","Data":"564de8ed4aaedc92c4b01321bf14034449454a201e4cab29870c84ac36ba5ec9"}
Apr 22 15:59:55.397504 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:55.397367 2565 scope.go:117] "RemoveContainer" containerID="564de8ed4aaedc92c4b01321bf14034449454a201e4cab29870c84ac36ba5ec9"
Apr 22 15:59:56.400602 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:56.400578 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/1.log"
Apr 22 15:59:56.400977 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:56.400940 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/0.log"
Apr 22 15:59:56.401014 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:56.400973 2565 generic.go:358] "Generic (PLEG): container finished" podID="72f4e084-b355-494f-955a-9d9d02e32cdb" containerID="89dbd6ca013e25d58884107b7a8cfaa6f3f776a90bd364a0e12383e1c1a18b88" exitCode=255
Apr 22 15:59:56.401050 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:56.401020 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" event={"ID":"72f4e084-b355-494f-955a-9d9d02e32cdb","Type":"ContainerDied","Data":"89dbd6ca013e25d58884107b7a8cfaa6f3f776a90bd364a0e12383e1c1a18b88"}
Apr 22 15:59:56.401050 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:56.401046 2565 scope.go:117] "RemoveContainer" containerID="564de8ed4aaedc92c4b01321bf14034449454a201e4cab29870c84ac36ba5ec9"
Apr 22 15:59:56.401275 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:56.401256 2565 scope.go:117] "RemoveContainer" containerID="89dbd6ca013e25d58884107b7a8cfaa6f3f776a90bd364a0e12383e1c1a18b88"
Apr 22 15:59:56.401467 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:56.401448 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-nrtqr_openshift-console-operator(72f4e084-b355-494f-955a-9d9d02e32cdb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" podUID="72f4e084-b355-494f-955a-9d9d02e32cdb"
Apr 22 15:59:57.404551 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:57.404519 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/1.log"
Apr 22 15:59:57.404933 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:57.404826 2565 scope.go:117] "RemoveContainer" containerID="89dbd6ca013e25d58884107b7a8cfaa6f3f776a90bd364a0e12383e1c1a18b88"
Apr 22 15:59:57.404999 ip-10-0-135-152 kubenswrapper[2565]: E0422 15:59:57.404971 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-nrtqr_openshift-console-operator(72f4e084-b355-494f-955a-9d9d02e32cdb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" podUID="72f4e084-b355-494f-955a-9d9d02e32cdb"
Apr 22 15:59:59.341832 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:59.341798 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v59wb"]
Apr 22 15:59:59.345791 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:59.345775 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v59wb"
Apr 22 15:59:59.347886 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:59.347867 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-cr7lz\""
Apr 22 15:59:59.354382 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:59.354359 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v59wb"]
Apr 22 15:59:59.419245 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:59.419220 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blqdv\" (UniqueName: \"kubernetes.io/projected/f622db3f-32a0-443c-a69a-282794392cb2-kube-api-access-blqdv\") pod \"network-check-source-8894fc9bd-v59wb\" (UID: \"f622db3f-32a0-443c-a69a-282794392cb2\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v59wb"
Apr 22 15:59:59.519841 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:59.519804 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blqdv\" (UniqueName: \"kubernetes.io/projected/f622db3f-32a0-443c-a69a-282794392cb2-kube-api-access-blqdv\") pod \"network-check-source-8894fc9bd-v59wb\" (UID: \"f622db3f-32a0-443c-a69a-282794392cb2\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v59wb"
Apr 22 15:59:59.527051 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:59.527024 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blqdv\" (UniqueName: \"kubernetes.io/projected/f622db3f-32a0-443c-a69a-282794392cb2-kube-api-access-blqdv\") pod \"network-check-source-8894fc9bd-v59wb\" (UID: \"f622db3f-32a0-443c-a69a-282794392cb2\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v59wb"
Apr 22 15:59:59.653875 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:59.653802 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v59wb"
Apr 22 15:59:59.763069 ip-10-0-135-152 kubenswrapper[2565]: I0422 15:59:59.763040 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-v59wb"]
Apr 22 15:59:59.766208 ip-10-0-135-152 kubenswrapper[2565]: W0422 15:59:59.766181 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf622db3f_32a0_443c_a69a_282794392cb2.slice/crio-73b657224120e8e4a6a068fc7d74e4c2a5fa1125702e773108493522deb43a2f WatchSource:0}: Error finding container 73b657224120e8e4a6a068fc7d74e4c2a5fa1125702e773108493522deb43a2f: Status 404 returned error can't find the container with id 73b657224120e8e4a6a068fc7d74e4c2a5fa1125702e773108493522deb43a2f
Apr 22 16:00:00.412527 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:00.412494 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v59wb" event={"ID":"f622db3f-32a0-443c-a69a-282794392cb2","Type":"ContainerStarted","Data":"41cba1f50b716ffb083b890d49f975155ca5cc1c1ef4c32e34f466a659198619"}
Apr 22 16:00:00.412527 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:00.412528 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v59wb" event={"ID":"f622db3f-32a0-443c-a69a-282794392cb2","Type":"ContainerStarted","Data":"73b657224120e8e4a6a068fc7d74e4c2a5fa1125702e773108493522deb43a2f"}
Apr 22 16:00:00.426404 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:00.426362 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-v59wb" podStartSLOduration=1.42634937 podStartE2EDuration="1.42634937s" podCreationTimestamp="2026-04-22 15:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:00:00.425675565 +0000 UTC m=+84.860769261" watchObservedRunningTime="2026-04-22 16:00:00.42634937 +0000 UTC m=+84.861443067"
Apr 22 16:00:02.834913 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:02.834859 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 16:00:02.834913 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:02.834917 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr"
Apr 22 16:00:02.835450 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:02.835398 2565 scope.go:117] "RemoveContainer" containerID="89dbd6ca013e25d58884107b7a8cfaa6f3f776a90bd364a0e12383e1c1a18b88"
Apr 22 16:00:02.835611 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:00:02.835591 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-nrtqr_openshift-console-operator(72f4e084-b355-494f-955a-9d9d02e32cdb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" podUID="72f4e084-b355-494f-955a-9d9d02e32cdb"
Apr 22 16:00:12.818442 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:12.818386 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7"
Apr 22 16:00:12.818442 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:12.818440 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq"
Apr 22 16:00:12.820855 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:12.820835 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls\") pod \"image-registry-77995db544-fmqlq\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") " pod="openshift-image-registry/image-registry-77995db544-fmqlq"
Apr 22 16:00:12.820922 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:12.820835 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83a80857-8a6e-454e-be2b-ecc561993b6d-metrics-tls\") pod \"dns-default-hx8z7\" (UID: \"83a80857-8a6e-454e-be2b-ecc561993b6d\") " pod="openshift-dns/dns-default-hx8z7"
Apr 22 16:00:12.886323 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:12.886294 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-4mqmh\""
Apr 22 16:00:12.895160 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:12.895142 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77995db544-fmqlq"
Apr 22 16:00:12.900877 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:12.900856 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lm8hj\""
Apr 22 16:00:12.909965 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:12.909937 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hx8z7"
Apr 22 16:00:12.919747 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:12.919722 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph"
Apr 22 16:00:12.922377 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:12.922354 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1480b1d8-29b2-4b9f-9d5a-04a492daadb0-cert\") pod \"ingress-canary-6s4ph\" (UID: \"1480b1d8-29b2-4b9f-9d5a-04a492daadb0\") " pod="openshift-ingress-canary/ingress-canary-6s4ph"
Apr 22 16:00:13.025438 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:13.025404 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77995db544-fmqlq"]
Apr 22 16:00:13.028146 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:13.028119 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b7616a8_fa8e_46e3_ac90_b510f706491e.slice/crio-639b7b1687b16cfb66371b37983368a9a81c2f1ec21c8aa9840b5ce356aecc83 WatchSource:0}: Error finding container 639b7b1687b16cfb66371b37983368a9a81c2f1ec21c8aa9840b5ce356aecc83: Status 404 returned error can't find the container with id 639b7b1687b16cfb66371b37983368a9a81c2f1ec21c8aa9840b5ce356aecc83
Apr 22 16:00:13.038645 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:13.038611 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hx8z7"]
Apr 22 16:00:13.041380 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:13.041358 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a80857_8a6e_454e_be2b_ecc561993b6d.slice/crio-3cde963217197722b3edf0cd53cacf5154453688bd44112eb1b5ed01e804e1d9 WatchSource:0}: Error finding container 3cde963217197722b3edf0cd53cacf5154453688bd44112eb1b5ed01e804e1d9: Status 404 returned error can't find the container with id 3cde963217197722b3edf0cd53cacf5154453688bd44112eb1b5ed01e804e1d9
Apr 22 16:00:13.208615 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:13.208583 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbbp8\""
Apr 22 16:00:13.216669 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:13.216650 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6s4ph"
Apr 22 16:00:13.331068 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:13.330990 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6s4ph"]
Apr 22 16:00:13.333506 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:13.333479 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1480b1d8_29b2_4b9f_9d5a_04a492daadb0.slice/crio-1636d3137bf124561bca3179eaf80c2af4c49f6754364a04f61acdadab90c348 WatchSource:0}: Error finding container 1636d3137bf124561bca3179eaf80c2af4c49f6754364a04f61acdadab90c348: Status 404 returned error can't find the container with id 1636d3137bf124561bca3179eaf80c2af4c49f6754364a04f61acdadab90c348
Apr 22 16:00:13.442379 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:13.442343 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hx8z7" event={"ID":"83a80857-8a6e-454e-be2b-ecc561993b6d","Type":"ContainerStarted","Data":"3cde963217197722b3edf0cd53cacf5154453688bd44112eb1b5ed01e804e1d9"}
Apr 22 16:00:13.443709 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:13.443675 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77995db544-fmqlq" event={"ID":"1b7616a8-fa8e-46e3-ac90-b510f706491e","Type":"ContainerStarted","Data":"7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0"}
Apr 22 16:00:13.443709 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:13.443708 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77995db544-fmqlq" event={"ID":"1b7616a8-fa8e-46e3-ac90-b510f706491e","Type":"ContainerStarted","Data":"639b7b1687b16cfb66371b37983368a9a81c2f1ec21c8aa9840b5ce356aecc83"}
Apr 22 16:00:13.443913 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:13.443797 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness"
status="not ready" pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 16:00:13.444777 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:13.444756 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6s4ph" event={"ID":"1480b1d8-29b2-4b9f-9d5a-04a492daadb0","Type":"ContainerStarted","Data":"1636d3137bf124561bca3179eaf80c2af4c49f6754364a04f61acdadab90c348"} Apr 22 16:00:13.462338 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:13.462294 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-77995db544-fmqlq" podStartSLOduration=97.462282793 podStartE2EDuration="1m37.462282793s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:00:13.461326273 +0000 UTC m=+97.896420005" watchObservedRunningTime="2026-04-22 16:00:13.462282793 +0000 UTC m=+97.897376529" Apr 22 16:00:15.453595 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:15.453551 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hx8z7" event={"ID":"83a80857-8a6e-454e-be2b-ecc561993b6d","Type":"ContainerStarted","Data":"d805c8daa18b318ad9d27c23c1791ed6c79f17695049880c344b951deae49ec6"} Apr 22 16:00:15.453595 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:15.453592 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hx8z7" event={"ID":"83a80857-8a6e-454e-be2b-ecc561993b6d","Type":"ContainerStarted","Data":"8d476c793c6e463c6075c8386e625c72690362933e0478b2216a36716161816b"} Apr 22 16:00:15.454075 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:15.453673 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hx8z7" Apr 22 16:00:15.454895 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:15.454870 2565 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6s4ph" event={"ID":"1480b1d8-29b2-4b9f-9d5a-04a492daadb0","Type":"ContainerStarted","Data":"6d0770701ed0ace34169b7ea45d64e89cfd15cbccfbe4f4d9a656ff07cb74c98"} Apr 22 16:00:15.470469 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:15.470424 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hx8z7" podStartSLOduration=66.032580307 podStartE2EDuration="1m7.470412251s" podCreationTimestamp="2026-04-22 15:59:08 +0000 UTC" firstStartedPulling="2026-04-22 16:00:13.043217016 +0000 UTC m=+97.478310691" lastFinishedPulling="2026-04-22 16:00:14.481048947 +0000 UTC m=+98.916142635" observedRunningTime="2026-04-22 16:00:15.470223069 +0000 UTC m=+99.905316790" watchObservedRunningTime="2026-04-22 16:00:15.470412251 +0000 UTC m=+99.905505947" Apr 22 16:00:15.483066 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:15.483021 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6s4ph" podStartSLOduration=65.487366935 podStartE2EDuration="1m7.483005465s" podCreationTimestamp="2026-04-22 15:59:08 +0000 UTC" firstStartedPulling="2026-04-22 16:00:13.335479596 +0000 UTC m=+97.770573272" lastFinishedPulling="2026-04-22 16:00:15.331118124 +0000 UTC m=+99.766211802" observedRunningTime="2026-04-22 16:00:15.482250129 +0000 UTC m=+99.917343827" watchObservedRunningTime="2026-04-22 16:00:15.483005465 +0000 UTC m=+99.918099163" Apr 22 16:00:18.122351 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:18.122320 2565 scope.go:117] "RemoveContainer" containerID="89dbd6ca013e25d58884107b7a8cfaa6f3f776a90bd364a0e12383e1c1a18b88" Apr 22 16:00:18.466540 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:18.466456 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/2.log" Apr 22 16:00:18.466864 
ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:18.466844 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/1.log" Apr 22 16:00:18.466967 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:18.466885 2565 generic.go:358] "Generic (PLEG): container finished" podID="72f4e084-b355-494f-955a-9d9d02e32cdb" containerID="77e97d8a044d8cc9b1d63eba35aedccbc356ae8c05c04976099dd50eaccb99a5" exitCode=255 Apr 22 16:00:18.466967 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:18.466922 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" event={"ID":"72f4e084-b355-494f-955a-9d9d02e32cdb","Type":"ContainerDied","Data":"77e97d8a044d8cc9b1d63eba35aedccbc356ae8c05c04976099dd50eaccb99a5"} Apr 22 16:00:18.466967 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:18.466955 2565 scope.go:117] "RemoveContainer" containerID="89dbd6ca013e25d58884107b7a8cfaa6f3f776a90bd364a0e12383e1c1a18b88" Apr 22 16:00:18.467287 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:18.467270 2565 scope.go:117] "RemoveContainer" containerID="77e97d8a044d8cc9b1d63eba35aedccbc356ae8c05c04976099dd50eaccb99a5" Apr 22 16:00:18.467461 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:00:18.467441 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-nrtqr_openshift-console-operator(72f4e084-b355-494f-955a-9d9d02e32cdb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" podUID="72f4e084-b355-494f-955a-9d9d02e32cdb" Apr 22 16:00:19.471101 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:19.471058 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/2.log" Apr 22 16:00:22.834105 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:22.834060 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" Apr 22 16:00:22.834473 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:22.834133 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" Apr 22 16:00:22.834473 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:22.834456 2565 scope.go:117] "RemoveContainer" containerID="77e97d8a044d8cc9b1d63eba35aedccbc356ae8c05c04976099dd50eaccb99a5" Apr 22 16:00:22.834671 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:00:22.834647 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-nrtqr_openshift-console-operator(72f4e084-b355-494f-955a-9d9d02e32cdb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" podUID="72f4e084-b355-494f-955a-9d9d02e32cdb" Apr 22 16:00:25.421991 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.421951 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8"] Apr 22 16:00:25.424850 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.424835 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8" Apr 22 16:00:25.427305 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.427286 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-8x7c7\"" Apr 22 16:00:25.427411 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.427320 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 16:00:25.428208 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.428188 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 16:00:25.428373 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.428235 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 16:00:25.428373 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.428282 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 16:00:25.434619 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.434597 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8"] Apr 22 16:00:25.459493 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.459474 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hx8z7" Apr 22 16:00:25.504847 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.504819 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68rrw\" (UniqueName: 
\"kubernetes.io/projected/e44c04b1-a0ad-4449-b798-16413aa65c45-kube-api-access-68rrw\") pod \"managed-serviceaccount-addon-agent-b84f7b8bc-78tq8\" (UID: \"e44c04b1-a0ad-4449-b798-16413aa65c45\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8" Apr 22 16:00:25.505007 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.504854 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e44c04b1-a0ad-4449-b798-16413aa65c45-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-b84f7b8bc-78tq8\" (UID: \"e44c04b1-a0ad-4449-b798-16413aa65c45\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8" Apr 22 16:00:25.518587 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.518550 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk"] Apr 22 16:00:25.521449 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.521427 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf"] Apr 22 16:00:25.521687 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.521666 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk" Apr 22 16:00:25.523985 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.523968 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-7n7dp\"" Apr 22 16:00:25.524080 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.523993 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 16:00:25.525520 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.525506 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf" Apr 22 16:00:25.527493 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.527479 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 16:00:25.527589 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.527506 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 16:00:25.527653 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.527591 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-dhtpp\"" Apr 22 16:00:25.534057 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.532893 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-24bnd"] Apr 22 16:00:25.536018 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.536001 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf"] Apr 22 16:00:25.536018 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.536021 2565 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk"] Apr 22 16:00:25.536195 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.536137 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.538363 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.538342 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 16:00:25.538483 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.538459 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 16:00:25.538949 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.538933 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 16:00:25.539056 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.539021 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lz7ck\"" Apr 22 16:00:25.539278 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.539264 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 16:00:25.547562 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.547530 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-24bnd"] Apr 22 16:00:25.605632 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.605603 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42mch\" (UniqueName: \"kubernetes.io/projected/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-kube-api-access-42mch\") pod \"insights-runtime-extractor-24bnd\" (UID: 
\"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.605632 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.605632 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d7a23691-2673-40b8-ae6c-5af9e81bc2d4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hdfnk\" (UID: \"d7a23691-2673-40b8-ae6c-5af9e81bc2d4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk" Apr 22 16:00:25.605800 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.605655 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.605800 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.605682 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/033e308b-fe58-4cc9-88b6-16bdf7f37e5b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zk6jf\" (UID: \"033e308b-fe58-4cc9-88b6-16bdf7f37e5b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf" Apr 22 16:00:25.605800 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.605740 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 
16:00:25.605800 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.605766 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-crio-socket\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.605800 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.605795 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68rrw\" (UniqueName: \"kubernetes.io/projected/e44c04b1-a0ad-4449-b798-16413aa65c45-kube-api-access-68rrw\") pod \"managed-serviceaccount-addon-agent-b84f7b8bc-78tq8\" (UID: \"e44c04b1-a0ad-4449-b798-16413aa65c45\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8" Apr 22 16:00:25.605949 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.605814 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e44c04b1-a0ad-4449-b798-16413aa65c45-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-b84f7b8bc-78tq8\" (UID: \"e44c04b1-a0ad-4449-b798-16413aa65c45\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8" Apr 22 16:00:25.605949 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.605848 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-data-volume\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.606010 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.605967 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/033e308b-fe58-4cc9-88b6-16bdf7f37e5b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zk6jf\" (UID: \"033e308b-fe58-4cc9-88b6-16bdf7f37e5b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf" Apr 22 16:00:25.608333 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.608308 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e44c04b1-a0ad-4449-b798-16413aa65c45-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-b84f7b8bc-78tq8\" (UID: \"e44c04b1-a0ad-4449-b798-16413aa65c45\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8" Apr 22 16:00:25.613772 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.613753 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68rrw\" (UniqueName: \"kubernetes.io/projected/e44c04b1-a0ad-4449-b798-16413aa65c45-kube-api-access-68rrw\") pod \"managed-serviceaccount-addon-agent-b84f7b8bc-78tq8\" (UID: \"e44c04b1-a0ad-4449-b798-16413aa65c45\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8" Apr 22 16:00:25.706685 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.706605 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.706685 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.706642 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-crio-socket\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.706685 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.706662 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-data-volume\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.706685 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.706682 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/033e308b-fe58-4cc9-88b6-16bdf7f37e5b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zk6jf\" (UID: \"033e308b-fe58-4cc9-88b6-16bdf7f37e5b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf" Apr 22 16:00:25.706962 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.706717 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42mch\" (UniqueName: \"kubernetes.io/projected/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-kube-api-access-42mch\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.706962 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.706737 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d7a23691-2673-40b8-ae6c-5af9e81bc2d4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hdfnk\" (UID: \"d7a23691-2673-40b8-ae6c-5af9e81bc2d4\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk" Apr 22 16:00:25.706962 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.706742 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-crio-socket\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.706962 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.706756 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.706962 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.706774 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/033e308b-fe58-4cc9-88b6-16bdf7f37e5b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zk6jf\" (UID: \"033e308b-fe58-4cc9-88b6-16bdf7f37e5b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf" Apr 22 16:00:25.707237 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.707047 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-data-volume\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.707407 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.707384 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.707502 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.707487 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/033e308b-fe58-4cc9-88b6-16bdf7f37e5b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zk6jf\" (UID: \"033e308b-fe58-4cc9-88b6-16bdf7f37e5b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf" Apr 22 16:00:25.709153 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.709137 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/033e308b-fe58-4cc9-88b6-16bdf7f37e5b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zk6jf\" (UID: \"033e308b-fe58-4cc9-88b6-16bdf7f37e5b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf" Apr 22 16:00:25.709494 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.709470 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.709590 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.709573 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/d7a23691-2673-40b8-ae6c-5af9e81bc2d4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hdfnk\" (UID: \"d7a23691-2673-40b8-ae6c-5af9e81bc2d4\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk" Apr 22 16:00:25.714315 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.714295 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42mch\" (UniqueName: \"kubernetes.io/projected/11ec3c8d-9449-4a8d-ae4f-431815ccfd6f-kube-api-access-42mch\") pod \"insights-runtime-extractor-24bnd\" (UID: \"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f\") " pod="openshift-insights/insights-runtime-extractor-24bnd" Apr 22 16:00:25.741483 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.741456 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8" Apr 22 16:00:25.836217 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.836184 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk" Apr 22 16:00:25.842988 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.842951 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf" Apr 22 16:00:25.849623 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.849602 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-24bnd"
Apr 22 16:00:25.855951 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.855924 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8"]
Apr 22 16:00:25.858806 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:25.858771 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode44c04b1_a0ad_4449_b798_16413aa65c45.slice/crio-35b60e4ab78e44af2d9f68a1369ec6258bfa6939949d9a7092a3ef2bd71abce3 WatchSource:0}: Error finding container 35b60e4ab78e44af2d9f68a1369ec6258bfa6939949d9a7092a3ef2bd71abce3: Status 404 returned error can't find the container with id 35b60e4ab78e44af2d9f68a1369ec6258bfa6939949d9a7092a3ef2bd71abce3
Apr 22 16:00:25.985530 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:25.985502 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf"]
Apr 22 16:00:25.988329 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:25.988299 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod033e308b_fe58_4cc9_88b6_16bdf7f37e5b.slice/crio-fbeced00bebf9500aeee139a01072506c080e1ed7cbdfab6f9499a088aa1713e WatchSource:0}: Error finding container fbeced00bebf9500aeee139a01072506c080e1ed7cbdfab6f9499a088aa1713e: Status 404 returned error can't find the container with id fbeced00bebf9500aeee139a01072506c080e1ed7cbdfab6f9499a088aa1713e
Apr 22 16:00:26.239743 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:26.239551 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk"]
Apr 22 16:00:26.239923 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:26.239899 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-24bnd"]
Apr 22 16:00:26.242029 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:26.242006 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a23691_2673_40b8_ae6c_5af9e81bc2d4.slice/crio-f6953501c0e5d85277919daf05ceb09ea30018ff27e0d34be049fd6ba72b559e WatchSource:0}: Error finding container f6953501c0e5d85277919daf05ceb09ea30018ff27e0d34be049fd6ba72b559e: Status 404 returned error can't find the container with id f6953501c0e5d85277919daf05ceb09ea30018ff27e0d34be049fd6ba72b559e
Apr 22 16:00:26.244008 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:26.243985 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11ec3c8d_9449_4a8d_ae4f_431815ccfd6f.slice/crio-421da58e36996ced78ee863eef706b5b861719463ab3f820dd4d4ece8d52ebe3 WatchSource:0}: Error finding container 421da58e36996ced78ee863eef706b5b861719463ab3f820dd4d4ece8d52ebe3: Status 404 returned error can't find the container with id 421da58e36996ced78ee863eef706b5b861719463ab3f820dd4d4ece8d52ebe3
Apr 22 16:00:26.490053 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:26.489956 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8" event={"ID":"e44c04b1-a0ad-4449-b798-16413aa65c45","Type":"ContainerStarted","Data":"35b60e4ab78e44af2d9f68a1369ec6258bfa6939949d9a7092a3ef2bd71abce3"}
Apr 22 16:00:26.491306 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:26.491278 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-24bnd" event={"ID":"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f","Type":"ContainerStarted","Data":"c5b93d21f795a7a8ead8f4be2ae25f7f604670b45795baf3220b839197c7e04c"}
Apr 22 16:00:26.491428 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:26.491309 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-24bnd" event={"ID":"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f","Type":"ContainerStarted","Data":"421da58e36996ced78ee863eef706b5b861719463ab3f820dd4d4ece8d52ebe3"}
Apr 22 16:00:26.492265 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:26.492234 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf" event={"ID":"033e308b-fe58-4cc9-88b6-16bdf7f37e5b","Type":"ContainerStarted","Data":"fbeced00bebf9500aeee139a01072506c080e1ed7cbdfab6f9499a088aa1713e"}
Apr 22 16:00:26.493324 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:26.493296 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk" event={"ID":"d7a23691-2673-40b8-ae6c-5af9e81bc2d4","Type":"ContainerStarted","Data":"f6953501c0e5d85277919daf05ceb09ea30018ff27e0d34be049fd6ba72b559e"}
Apr 22 16:00:29.502921 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.502881 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-24bnd" event={"ID":"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f","Type":"ContainerStarted","Data":"f7f45e5a15f3467a6de9a569cbe9abf3e75350726a5ebd4c5d3e86c3df794f91"}
Apr 22 16:00:29.504343 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.504316 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf" event={"ID":"033e308b-fe58-4cc9-88b6-16bdf7f37e5b","Type":"ContainerStarted","Data":"e8c5a78944e2f4f8adac8429e6755a605fbc7754f1e155e56ca45775501df77a"}
Apr 22 16:00:29.505740 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.505712 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk" event={"ID":"d7a23691-2673-40b8-ae6c-5af9e81bc2d4","Type":"ContainerStarted","Data":"420e4e970f5809725355544c6aa4f594308cf3c0934ea9f9108b872624ba6567"}
Apr 22 16:00:29.505936 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.505912 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk"
Apr 22 16:00:29.507330 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.507303 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8" event={"ID":"e44c04b1-a0ad-4449-b798-16413aa65c45","Type":"ContainerStarted","Data":"389851df318c5d2208e0a6c541fe5935e7c212d530d42891a3ef9c9dd67d0146"}
Apr 22 16:00:29.511581 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.511558 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk"
Apr 22 16:00:29.518359 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.518324 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zk6jf" podStartSLOduration=1.82148762 podStartE2EDuration="4.518314429s" podCreationTimestamp="2026-04-22 16:00:25 +0000 UTC" firstStartedPulling="2026-04-22 16:00:25.990510912 +0000 UTC m=+110.425604587" lastFinishedPulling="2026-04-22 16:00:28.687337717 +0000 UTC m=+113.122431396" observedRunningTime="2026-04-22 16:00:29.517783895 +0000 UTC m=+113.952877604" watchObservedRunningTime="2026-04-22 16:00:29.518314429 +0000 UTC m=+113.953408125"
Apr 22 16:00:29.530881 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.530830 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-b84f7b8bc-78tq8" podStartSLOduration=1.6388100159999999 podStartE2EDuration="4.530818283s" podCreationTimestamp="2026-04-22 16:00:25 +0000 UTC" firstStartedPulling="2026-04-22 16:00:25.860749413 +0000 UTC m=+110.295843093" lastFinishedPulling="2026-04-22 16:00:28.752757671 +0000 UTC m=+113.187851360" observedRunningTime="2026-04-22 16:00:29.529970848 +0000 UTC m=+113.965064546" watchObservedRunningTime="2026-04-22 16:00:29.530818283 +0000 UTC m=+113.965911981"
Apr 22 16:00:29.543511 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.543463 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hdfnk" podStartSLOduration=2.097444536 podStartE2EDuration="4.543452236s" podCreationTimestamp="2026-04-22 16:00:25 +0000 UTC" firstStartedPulling="2026-04-22 16:00:26.243754455 +0000 UTC m=+110.678848130" lastFinishedPulling="2026-04-22 16:00:28.689762138 +0000 UTC m=+113.124855830" observedRunningTime="2026-04-22 16:00:29.542943969 +0000 UTC m=+113.978037667" watchObservedRunningTime="2026-04-22 16:00:29.543452236 +0000 UTC m=+113.978545933"
Apr 22 16:00:29.739843 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.739813 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"]
Apr 22 16:00:29.744019 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.743998 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:29.746477 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.746454 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 16:00:29.746588 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.746483 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 22 16:00:29.746588 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.746465 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 16:00:29.746588 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.746547 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-phqmr\""
Apr 22 16:00:29.746588 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.746560 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 16:00:29.746820 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.746804 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 22 16:00:29.753848 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.753797 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"]
Apr 22 16:00:29.838064 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.838017 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ebef7e67-a38a-414a-88b6-ee4a94e326fe-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:29.838212 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.838102 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5xh7\" (UniqueName: \"kubernetes.io/projected/ebef7e67-a38a-414a-88b6-ee4a94e326fe-kube-api-access-x5xh7\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:29.838212 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.838172 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebef7e67-a38a-414a-88b6-ee4a94e326fe-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:29.838212 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.838198 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebef7e67-a38a-414a-88b6-ee4a94e326fe-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:29.939582 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.939545 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ebef7e67-a38a-414a-88b6-ee4a94e326fe-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:29.939747 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.939614 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5xh7\" (UniqueName: \"kubernetes.io/projected/ebef7e67-a38a-414a-88b6-ee4a94e326fe-kube-api-access-x5xh7\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:29.939747 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.939670 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebef7e67-a38a-414a-88b6-ee4a94e326fe-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:29.939747 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.939695 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebef7e67-a38a-414a-88b6-ee4a94e326fe-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:29.939918 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:00:29.939774 2565 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 22 16:00:29.939918 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:00:29.939847 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebef7e67-a38a-414a-88b6-ee4a94e326fe-prometheus-operator-tls podName:ebef7e67-a38a-414a-88b6-ee4a94e326fe nodeName:}" failed. No retries permitted until 2026-04-22 16:00:30.439826102 +0000 UTC m=+114.874919780 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/ebef7e67-a38a-414a-88b6-ee4a94e326fe-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-dhtgt" (UID: "ebef7e67-a38a-414a-88b6-ee4a94e326fe") : secret "prometheus-operator-tls" not found
Apr 22 16:00:29.940389 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.940370 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebef7e67-a38a-414a-88b6-ee4a94e326fe-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:29.942339 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.942318 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ebef7e67-a38a-414a-88b6-ee4a94e326fe-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:29.948382 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:29.948364 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5xh7\" (UniqueName: \"kubernetes.io/projected/ebef7e67-a38a-414a-88b6-ee4a94e326fe-kube-api-access-x5xh7\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:30.447047 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:30.446959 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebef7e67-a38a-414a-88b6-ee4a94e326fe-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:30.449932 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:30.449908 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebef7e67-a38a-414a-88b6-ee4a94e326fe-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dhtgt\" (UID: \"ebef7e67-a38a-414a-88b6-ee4a94e326fe\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:30.513860 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:30.513822 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-24bnd" event={"ID":"11ec3c8d-9449-4a8d-ae4f-431815ccfd6f","Type":"ContainerStarted","Data":"a3e77526662452fe2cf29b34f3613c3ede014e6ab9266ea5bd2ce92418a10885"}
Apr 22 16:00:30.530378 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:30.530335 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-24bnd" podStartSLOduration=1.6775760499999999 podStartE2EDuration="5.530321823s" podCreationTimestamp="2026-04-22 16:00:25 +0000 UTC" firstStartedPulling="2026-04-22 16:00:26.306675781 +0000 UTC m=+110.741769456" lastFinishedPulling="2026-04-22 16:00:30.159421553 +0000 UTC m=+114.594515229" observedRunningTime="2026-04-22 16:00:30.529388132 +0000 UTC m=+114.964481829" watchObservedRunningTime="2026-04-22 16:00:30.530321823 +0000 UTC m=+114.965415517"
Apr 22 16:00:30.655496 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:30.655467 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"
Apr 22 16:00:30.767915 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:30.767887 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dhtgt"]
Apr 22 16:00:30.770626 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:30.770598 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebef7e67_a38a_414a_88b6_ee4a94e326fe.slice/crio-a393b16fbe97080fd88953d2bc530c07dc65b47ed01f0ed0c2eeaacca3c7fabe WatchSource:0}: Error finding container a393b16fbe97080fd88953d2bc530c07dc65b47ed01f0ed0c2eeaacca3c7fabe: Status 404 returned error can't find the container with id a393b16fbe97080fd88953d2bc530c07dc65b47ed01f0ed0c2eeaacca3c7fabe
Apr 22 16:00:31.517734 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:31.517697 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt" event={"ID":"ebef7e67-a38a-414a-88b6-ee4a94e326fe","Type":"ContainerStarted","Data":"a393b16fbe97080fd88953d2bc530c07dc65b47ed01f0ed0c2eeaacca3c7fabe"}
Apr 22 16:00:32.522920 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:32.522882 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt" event={"ID":"ebef7e67-a38a-414a-88b6-ee4a94e326fe","Type":"ContainerStarted","Data":"fbd17111f8850f507296a70ca20630ee36b6f199bdf6240d6627f73b84a6104b"}
Apr 22 16:00:32.522920 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:32.522919 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt" event={"ID":"ebef7e67-a38a-414a-88b6-ee4a94e326fe","Type":"ContainerStarted","Data":"e2f683fc4556cafbc381b36781bf9652fa3bf5fe2ff8e3e7081b0245316593e3"}
Apr 22 16:00:32.537427 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:32.537390 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-dhtgt" podStartSLOduration=2.318216224 podStartE2EDuration="3.537377288s" podCreationTimestamp="2026-04-22 16:00:29 +0000 UTC" firstStartedPulling="2026-04-22 16:00:30.77243839 +0000 UTC m=+115.207532065" lastFinishedPulling="2026-04-22 16:00:31.99159945 +0000 UTC m=+116.426693129" observedRunningTime="2026-04-22 16:00:32.537373235 +0000 UTC m=+116.972466934" watchObservedRunningTime="2026-04-22 16:00:32.537377288 +0000 UTC m=+116.972470978"
Apr 22 16:00:32.899601 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:32.899514 2565 patch_prober.go:28] interesting pod/image-registry-77995db544-fmqlq container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 16:00:32.899736 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:32.899573 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-77995db544-fmqlq" podUID="1b7616a8-fa8e-46e3-ac90-b510f706491e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 16:00:34.075886 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.075844 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"]
Apr 22 16:00:34.079044 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.079026 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.082227 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.082202 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 22 16:00:34.082341 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.082203 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-9ht46\""
Apr 22 16:00:34.082774 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.082413 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 22 16:00:34.088454 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.088433 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"]
Apr 22 16:00:34.105432 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.105407 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ctlfq"]
Apr 22 16:00:34.108594 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.108572 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.111745 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.111727 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 16:00:34.111895 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.111879 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4m4rr\""
Apr 22 16:00:34.112161 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.112146 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 16:00:34.112298 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.112282 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 16:00:34.121770 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.121754 2565 scope.go:117] "RemoveContainer" containerID="77e97d8a044d8cc9b1d63eba35aedccbc356ae8c05c04976099dd50eaccb99a5"
Apr 22 16:00:34.121913 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:00:34.121898 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-nrtqr_openshift-console-operator(72f4e084-b355-494f-955a-9d9d02e32cdb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" podUID="72f4e084-b355-494f-955a-9d9d02e32cdb"
Apr 22 16:00:34.177340 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.177305 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-t89m5\" (UID: \"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.177480 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.177346 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-t89m5\" (UID: \"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.177480 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.177440 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-t89m5\" (UID: \"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.177480 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.177475 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqzp7\" (UniqueName: \"kubernetes.io/projected/2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca-kube-api-access-vqzp7\") pod \"openshift-state-metrics-9d44df66c-t89m5\" (UID: \"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.278614 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.278579 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-t89m5\" (UID: \"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.278773 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.278634 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqzp7\" (UniqueName: \"kubernetes.io/projected/2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca-kube-api-access-vqzp7\") pod \"openshift-state-metrics-9d44df66c-t89m5\" (UID: \"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.278773 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.278663 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e9363b5-12aa-4223-bf69-f67620bf66d7-metrics-client-ca\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.278866 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.278842 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-tls\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.278955 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.278937 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.279008 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.278981 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e9363b5-12aa-4223-bf69-f67620bf66d7-sys\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.279041 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.279034 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-t89m5\" (UID: \"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.279119 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.279055 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-t89m5\" (UID: \"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.279173 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.279115 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7e9363b5-12aa-4223-bf69-f67620bf66d7-root\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.279260 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.279240 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-wtmp\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.279363 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.279284 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-accelerators-collector-config\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.279363 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.279332 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-textfile\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.279446 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.279378 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgzv6\" (UniqueName: \"kubernetes.io/projected/7e9363b5-12aa-4223-bf69-f67620bf66d7-kube-api-access-sgzv6\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.279446 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.279403 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-t89m5\" (UID: \"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.281716 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.281693 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-t89m5\" (UID: \"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.281766 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.281714 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-t89m5\" (UID: \"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.288219 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.288194 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqzp7\" (UniqueName: \"kubernetes.io/projected/2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca-kube-api-access-vqzp7\") pod \"openshift-state-metrics-9d44df66c-t89m5\" (UID: \"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"
Apr 22 16:00:34.380555 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380481 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-textfile\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.380555 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380517 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgzv6\" (UniqueName: \"kubernetes.io/projected/7e9363b5-12aa-4223-bf69-f67620bf66d7-kube-api-access-sgzv6\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.380555 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380554 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e9363b5-12aa-4223-bf69-f67620bf66d7-metrics-client-ca\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.380781 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380578 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-tls\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.380781 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380620 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.380781 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380650 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e9363b5-12aa-4223-bf69-f67620bf66d7-sys\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq"
Apr 22 16:00:34.380781 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380728 2565 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e9363b5-12aa-4223-bf69-f67620bf66d7-sys\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.380968 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380778 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7e9363b5-12aa-4223-bf69-f67620bf66d7-root\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.380968 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380815 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-wtmp\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.380968 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380843 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-accelerators-collector-config\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.380968 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380889 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7e9363b5-12aa-4223-bf69-f67620bf66d7-root\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.380968 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380902 2565 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-textfile\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.380968 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.380959 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-wtmp\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.381272 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.381218 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e9363b5-12aa-4223-bf69-f67620bf66d7-metrics-client-ca\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.381343 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.381326 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-accelerators-collector-config\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.383155 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.383135 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-tls\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.383251 ip-10-0-135-152 
kubenswrapper[2565]: I0422 16:00:34.383242 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e9363b5-12aa-4223-bf69-f67620bf66d7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.391195 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.391174 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5" Apr 22 16:00:34.400672 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.400654 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgzv6\" (UniqueName: \"kubernetes.io/projected/7e9363b5-12aa-4223-bf69-f67620bf66d7-kube-api-access-sgzv6\") pod \"node-exporter-ctlfq\" (UID: \"7e9363b5-12aa-4223-bf69-f67620bf66d7\") " pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.417745 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.417722 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ctlfq" Apr 22 16:00:34.425797 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:34.425761 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e9363b5_12aa_4223_bf69_f67620bf66d7.slice/crio-936f994f5e3a357d4d65bee3fdc88fdf7a90a36b80312a4760b88c2afb560980 WatchSource:0}: Error finding container 936f994f5e3a357d4d65bee3fdc88fdf7a90a36b80312a4760b88c2afb560980: Status 404 returned error can't find the container with id 936f994f5e3a357d4d65bee3fdc88fdf7a90a36b80312a4760b88c2afb560980 Apr 22 16:00:34.456744 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.456722 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-77995db544-fmqlq" Apr 22 16:00:34.518668 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.518641 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5"] Apr 22 16:00:34.521619 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:34.521583 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3da36b_72fc_4c43_98f5_6cfe4e13c7ca.slice/crio-23dcb65f4a03511f75b6fb8492dad44017024f3d62eeddda06a285e4ea8efaa8 WatchSource:0}: Error finding container 23dcb65f4a03511f75b6fb8492dad44017024f3d62eeddda06a285e4ea8efaa8: Status 404 returned error can't find the container with id 23dcb65f4a03511f75b6fb8492dad44017024f3d62eeddda06a285e4ea8efaa8 Apr 22 16:00:34.529278 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:34.529252 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ctlfq" event={"ID":"7e9363b5-12aa-4223-bf69-f67620bf66d7","Type":"ContainerStarted","Data":"936f994f5e3a357d4d65bee3fdc88fdf7a90a36b80312a4760b88c2afb560980"} Apr 22 16:00:34.530259 ip-10-0-135-152 
kubenswrapper[2565]: I0422 16:00:34.530227 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5" event={"ID":"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca","Type":"ContainerStarted","Data":"23dcb65f4a03511f75b6fb8492dad44017024f3d62eeddda06a285e4ea8efaa8"} Apr 22 16:00:35.534626 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:35.534582 2565 generic.go:358] "Generic (PLEG): container finished" podID="7e9363b5-12aa-4223-bf69-f67620bf66d7" containerID="60adb7547986e07ea18fac8a37a236b677f2da52562f097e8f631f241e8e5c2c" exitCode=0 Apr 22 16:00:35.535044 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:35.534655 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ctlfq" event={"ID":"7e9363b5-12aa-4223-bf69-f67620bf66d7","Type":"ContainerDied","Data":"60adb7547986e07ea18fac8a37a236b677f2da52562f097e8f631f241e8e5c2c"} Apr 22 16:00:35.536591 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:35.536570 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5" event={"ID":"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca","Type":"ContainerStarted","Data":"031a083a222e3c970d2817f2e9130572097a355e7e95bb6423249d512c55010c"} Apr 22 16:00:35.536710 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:35.536598 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5" event={"ID":"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca","Type":"ContainerStarted","Data":"b03642f98608500829ffe0c3ad25b7d389c84f56420bb15b1ff92ed3b0b1e971"} Apr 22 16:00:36.540766 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:36.540722 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ctlfq" event={"ID":"7e9363b5-12aa-4223-bf69-f67620bf66d7","Type":"ContainerStarted","Data":"ff7c41aa90fb47a99c6df9c7233c1de65e2945e62f73fa2a289aa4ac462104a4"} Apr 22 16:00:36.540766 
ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:36.540758 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ctlfq" event={"ID":"7e9363b5-12aa-4223-bf69-f67620bf66d7","Type":"ContainerStarted","Data":"272a420e689f15d0d02d5188e057194d0abad474cf87610aa9253296599d3b91"} Apr 22 16:00:36.542605 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:36.542575 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5" event={"ID":"2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca","Type":"ContainerStarted","Data":"56942ae79b52967ff89dbbf518bb68c0919c00ad0ffea3405176e6015a7b8ae2"} Apr 22 16:00:36.559341 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:36.559294 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ctlfq" podStartSLOduration=1.8705581150000001 podStartE2EDuration="2.559281656s" podCreationTimestamp="2026-04-22 16:00:34 +0000 UTC" firstStartedPulling="2026-04-22 16:00:34.42757771 +0000 UTC m=+118.862671385" lastFinishedPulling="2026-04-22 16:00:35.116301251 +0000 UTC m=+119.551394926" observedRunningTime="2026-04-22 16:00:36.558144732 +0000 UTC m=+120.993238429" watchObservedRunningTime="2026-04-22 16:00:36.559281656 +0000 UTC m=+120.994375378" Apr 22 16:00:36.573690 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:36.573644 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-t89m5" podStartSLOduration=1.5644155899999999 podStartE2EDuration="2.573632302s" podCreationTimestamp="2026-04-22 16:00:34 +0000 UTC" firstStartedPulling="2026-04-22 16:00:34.65289477 +0000 UTC m=+119.087988444" lastFinishedPulling="2026-04-22 16:00:35.662111477 +0000 UTC m=+120.097205156" observedRunningTime="2026-04-22 16:00:36.572358628 +0000 UTC m=+121.007452325" watchObservedRunningTime="2026-04-22 16:00:36.573632302 +0000 UTC m=+121.008725998" Apr 22 
16:00:37.169798 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.169766 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb"] Apr 22 16:00:37.174367 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.174346 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.176586 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.176569 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 16:00:37.176720 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.176701 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 16:00:37.176791 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.176701 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-hl9vh\"" Apr 22 16:00:37.176849 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.176789 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 16:00:37.176849 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.176790 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-ak3t38fabbdf1\"" Apr 22 16:00:37.177212 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.177167 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 16:00:37.181133 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.181110 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 16:00:37.187996 
ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.187974 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb"] Apr 22 16:00:37.304522 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.304493 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.304522 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.304531 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-grpc-tls\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.304741 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.304638 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86rtk\" (UniqueName: \"kubernetes.io/projected/00a88686-6ce6-4405-bd51-ea862bbae204-kube-api-access-86rtk\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.304741 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.304665 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-tls\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " 
pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.304741 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.304688 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00a88686-6ce6-4405-bd51-ea862bbae204-metrics-client-ca\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.304872 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.304770 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.304872 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.304789 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.304872 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.304832 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" 
Apr 22 16:00:37.405591 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.405557 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86rtk\" (UniqueName: \"kubernetes.io/projected/00a88686-6ce6-4405-bd51-ea862bbae204-kube-api-access-86rtk\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.405591 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.405593 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-tls\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.405840 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.405615 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00a88686-6ce6-4405-bd51-ea862bbae204-metrics-client-ca\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.405840 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.405650 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.405840 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.405669 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.405840 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.405690 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.405840 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.405714 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.405840 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.405789 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-grpc-tls\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.406838 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.406793 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00a88686-6ce6-4405-bd51-ea862bbae204-metrics-client-ca\") pod \"thanos-querier-695bbcdf7d-5tzxb\" 
(UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.408548 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.408504 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-grpc-tls\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.408548 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.408524 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.408821 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.408802 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-tls\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.408952 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.408928 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.409024 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.408935 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.409076 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.409030 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/00a88686-6ce6-4405-bd51-ea862bbae204-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.412607 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.412585 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86rtk\" (UniqueName: \"kubernetes.io/projected/00a88686-6ce6-4405-bd51-ea862bbae204-kube-api-access-86rtk\") pod \"thanos-querier-695bbcdf7d-5tzxb\" (UID: \"00a88686-6ce6-4405-bd51-ea862bbae204\") " pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.487655 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.487586 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:37.610810 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:37.610779 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb"] Apr 22 16:00:37.612733 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:37.612706 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a88686_6ce6_4405_bd51_ea862bbae204.slice/crio-79906eb8cb6d9a235423bac30b604bddadaa9ef31b275ef3cbcf9183a507dd2d WatchSource:0}: Error finding container 79906eb8cb6d9a235423bac30b604bddadaa9ef31b275ef3cbcf9183a507dd2d: Status 404 returned error can't find the container with id 79906eb8cb6d9a235423bac30b604bddadaa9ef31b275ef3cbcf9183a507dd2d Apr 22 16:00:38.512046 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.511975 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-74dfc75f77-4s8jm"] Apr 22 16:00:38.515390 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.515363 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm" Apr 22 16:00:38.518609 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.518583 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 16:00:38.518746 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.518696 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 16:00:38.518871 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.518588 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-hlxjr\"" Apr 22 16:00:38.518925 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.518910 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-fr9pa6likisf4\"" Apr 22 16:00:38.519152 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.519129 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 16:00:38.519282 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.519170 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 16:00:38.524510 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.524488 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-74dfc75f77-4s8jm"] Apr 22 16:00:38.551344 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.551309 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" event={"ID":"00a88686-6ce6-4405-bd51-ea862bbae204","Type":"ContainerStarted","Data":"79906eb8cb6d9a235423bac30b604bddadaa9ef31b275ef3cbcf9183a507dd2d"} Apr 22 16:00:38.614781 ip-10-0-135-152 kubenswrapper[2565]: 
I0422 16:00:38.614747 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/64aca963-edf8-457f-990b-6fd0e03348b1-secret-metrics-server-tls\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.614781 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.614784 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/64aca963-edf8-457f-990b-6fd0e03348b1-audit-log\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.615324 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.614817 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64aca963-edf8-457f-990b-6fd0e03348b1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.615324 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.614932 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/64aca963-edf8-457f-990b-6fd0e03348b1-metrics-server-audit-profiles\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.615324 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.614974 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64aca963-edf8-457f-990b-6fd0e03348b1-client-ca-bundle\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.615324 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.615009 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/64aca963-edf8-457f-990b-6fd0e03348b1-secret-metrics-server-client-certs\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.615324 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.615066 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlnl7\" (UniqueName: \"kubernetes.io/projected/64aca963-edf8-457f-990b-6fd0e03348b1-kube-api-access-dlnl7\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.716518 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.716479 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/64aca963-edf8-457f-990b-6fd0e03348b1-metrics-server-audit-profiles\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.716518 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.716520 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64aca963-edf8-457f-990b-6fd0e03348b1-client-ca-bundle\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.716720 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.716549 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/64aca963-edf8-457f-990b-6fd0e03348b1-secret-metrics-server-client-certs\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.716720 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.716618 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlnl7\" (UniqueName: \"kubernetes.io/projected/64aca963-edf8-457f-990b-6fd0e03348b1-kube-api-access-dlnl7\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.716830 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.716724 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/64aca963-edf8-457f-990b-6fd0e03348b1-secret-metrics-server-tls\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.716830 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.716749 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/64aca963-edf8-457f-990b-6fd0e03348b1-audit-log\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.716830 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.716784 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64aca963-edf8-457f-990b-6fd0e03348b1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.717487 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.717437 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/64aca963-edf8-457f-990b-6fd0e03348b1-audit-log\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.717769 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.717746 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64aca963-edf8-457f-990b-6fd0e03348b1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.718626 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.718601 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/64aca963-edf8-457f-990b-6fd0e03348b1-metrics-server-audit-profiles\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.719456 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.719428 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64aca963-edf8-457f-990b-6fd0e03348b1-client-ca-bundle\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.719571 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.719509 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/64aca963-edf8-457f-990b-6fd0e03348b1-secret-metrics-server-client-certs\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.719670 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.719648 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/64aca963-edf8-457f-990b-6fd0e03348b1-secret-metrics-server-tls\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.723796 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.723768 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlnl7\" (UniqueName: \"kubernetes.io/projected/64aca963-edf8-457f-990b-6fd0e03348b1-kube-api-access-dlnl7\") pod \"metrics-server-74dfc75f77-4s8jm\" (UID: \"64aca963-edf8-457f-990b-6fd0e03348b1\") " pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:38.828190 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:38.828120 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:39.293227 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.293196 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5997968445-m59rp"]
Apr 22 16:00:39.297791 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.297769 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.300597 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.300373 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 22 16:00:39.300597 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.300392 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 22 16:00:39.300597 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.300433 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 22 16:00:39.300771 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.300646 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 22 16:00:39.300817 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.300808 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-q4mfp\""
Apr 22 16:00:39.300966 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.300951 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 22 16:00:39.304777 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.304758 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 22 16:00:39.306349 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.306330 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5997968445-m59rp"]
Apr 22 16:00:39.329053 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.329009 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-74dfc75f77-4s8jm"]
Apr 22 16:00:39.330928 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:39.330908 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64aca963_edf8_457f_990b_6fd0e03348b1.slice/crio-162e51a6fa43f5126b61295db36610450db22d9609ff5b6e7118089b776d9b8a WatchSource:0}: Error finding container 162e51a6fa43f5126b61295db36610450db22d9609ff5b6e7118089b776d9b8a: Status 404 returned error can't find the container with id 162e51a6fa43f5126b61295db36610450db22d9609ff5b6e7118089b776d9b8a
Apr 22 16:00:39.421901 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.421864 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b837bd3-a08f-465c-a109-a195a45ffccb-metrics-client-ca\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.421992 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.421918 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b837bd3-a08f-465c-a109-a195a45ffccb-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.421992 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.421940 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3b837bd3-a08f-465c-a109-a195a45ffccb-secret-telemeter-client\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.421992 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.421958 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b837bd3-a08f-465c-a109-a195a45ffccb-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.421992 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.421988 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pldqf\" (UniqueName: \"kubernetes.io/projected/3b837bd3-a08f-465c-a109-a195a45ffccb-kube-api-access-pldqf\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.422159 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.422135 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3b837bd3-a08f-465c-a109-a195a45ffccb-telemeter-client-tls\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.422203 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.422177 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3b837bd3-a08f-465c-a109-a195a45ffccb-federate-client-tls\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.422238 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.422216 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b837bd3-a08f-465c-a109-a195a45ffccb-serving-certs-ca-bundle\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.523193 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.523164 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b837bd3-a08f-465c-a109-a195a45ffccb-metrics-client-ca\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.523342 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.523208 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b837bd3-a08f-465c-a109-a195a45ffccb-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.523342 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.523324 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3b837bd3-a08f-465c-a109-a195a45ffccb-secret-telemeter-client\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.523426 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.523363 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b837bd3-a08f-465c-a109-a195a45ffccb-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.523426 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.523393 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pldqf\" (UniqueName: \"kubernetes.io/projected/3b837bd3-a08f-465c-a109-a195a45ffccb-kube-api-access-pldqf\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.523495 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.523470 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3b837bd3-a08f-465c-a109-a195a45ffccb-telemeter-client-tls\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.523547 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.523509 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3b837bd3-a08f-465c-a109-a195a45ffccb-federate-client-tls\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.523597 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.523553 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b837bd3-a08f-465c-a109-a195a45ffccb-serving-certs-ca-bundle\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.523895 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.523849 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b837bd3-a08f-465c-a109-a195a45ffccb-metrics-client-ca\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.524335 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.524270 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b837bd3-a08f-465c-a109-a195a45ffccb-serving-certs-ca-bundle\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.524335 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.524298 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b837bd3-a08f-465c-a109-a195a45ffccb-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.526031 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.526007 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3b837bd3-a08f-465c-a109-a195a45ffccb-federate-client-tls\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.526120 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.526029 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3b837bd3-a08f-465c-a109-a195a45ffccb-telemeter-client-tls\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.526261 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.526245 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3b837bd3-a08f-465c-a109-a195a45ffccb-secret-telemeter-client\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.526303 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.526257 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b837bd3-a08f-465c-a109-a195a45ffccb-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.530804 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.530782 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pldqf\" (UniqueName: \"kubernetes.io/projected/3b837bd3-a08f-465c-a109-a195a45ffccb-kube-api-access-pldqf\") pod \"telemeter-client-5997968445-m59rp\" (UID: \"3b837bd3-a08f-465c-a109-a195a45ffccb\") " pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.555129 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.555101 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm" event={"ID":"64aca963-edf8-457f-990b-6fd0e03348b1","Type":"ContainerStarted","Data":"162e51a6fa43f5126b61295db36610450db22d9609ff5b6e7118089b776d9b8a"}
Apr 22 16:00:39.556648 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.556627 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" event={"ID":"00a88686-6ce6-4405-bd51-ea862bbae204","Type":"ContainerStarted","Data":"f5471e702c72d2796b867066f9626b7b0b5798176a4e09c2b07771b8561280ca"}
Apr 22 16:00:39.556737 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.556654 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" event={"ID":"00a88686-6ce6-4405-bd51-ea862bbae204","Type":"ContainerStarted","Data":"33cceecabbb7ca1abf709746ced8f04ccf32da787664b6fc487f9338a4d628f4"}
Apr 22 16:00:39.556737 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.556664 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" event={"ID":"00a88686-6ce6-4405-bd51-ea862bbae204","Type":"ContainerStarted","Data":"b21201af59dec1446ebfd376aea922e5513d560ff4494623be4551afadd6e0cf"}
Apr 22 16:00:39.620963 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.620939 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5997968445-m59rp"
Apr 22 16:00:39.739198 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:39.739034 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5997968445-m59rp"]
Apr 22 16:00:39.741729 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:39.741696 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b837bd3_a08f_465c_a109_a195a45ffccb.slice/crio-24fef0231a6fd83a98f35772a3b9ad07b1207ff8b880d5a64c56c22dd1061c5d WatchSource:0}: Error finding container 24fef0231a6fd83a98f35772a3b9ad07b1207ff8b880d5a64c56c22dd1061c5d: Status 404 returned error can't find the container with id 24fef0231a6fd83a98f35772a3b9ad07b1207ff8b880d5a64c56c22dd1061c5d
Apr 22 16:00:40.397860 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.397519 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 16:00:40.403167 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.403142 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.408335 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.407887 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 16:00:40.408335 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.407930 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 16:00:40.408335 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.408003 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 16:00:40.408335 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.408113 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 16:00:40.408335 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.407896 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 16:00:40.408335 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.408161 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 16:00:40.408335 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.408225 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 16:00:40.408335 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.408338 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 16:00:40.408824 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.408414 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 16:00:40.408824 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.408607 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 16:00:40.409139 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.409119 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1di3j4jh2ntqg\""
Apr 22 16:00:40.409358 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.409341 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qdbw9\""
Apr 22 16:00:40.409555 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.409536 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 16:00:40.417350 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.416313 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 16:00:40.421372 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.420835 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 16:00:40.538389 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.537925 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538389 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538005 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-872r5\" (UniqueName: \"kubernetes.io/projected/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-kube-api-access-872r5\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538389 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538045 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538389 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538073 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538389 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538133 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538389 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538202 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538389 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538233 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538389 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538279 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538389 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538307 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538389 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538340 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538389 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538378 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538969 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538425 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538969 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538453 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538969 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538481 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-config\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538969 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538520 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-config-out\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538969 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538556 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538969 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538578 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-web-config\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.538969 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.538603 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.564193 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.564110 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" event={"ID":"00a88686-6ce6-4405-bd51-ea862bbae204","Type":"ContainerStarted","Data":"2c5282debc01f03bc93e65660f9694ef4c49df8563ab2bc3f886bbb89da659fd"}
Apr 22 16:00:40.564193 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.564157 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" event={"ID":"00a88686-6ce6-4405-bd51-ea862bbae204","Type":"ContainerStarted","Data":"a65ce5663b38e240e75213a98684ef4fe288522de0df77f0d98c8fa96e254dba"}
Apr 22 16:00:40.565441 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.565411 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5997968445-m59rp" event={"ID":"3b837bd3-a08f-465c-a109-a195a45ffccb","Type":"ContainerStarted","Data":"24fef0231a6fd83a98f35772a3b9ad07b1207ff8b880d5a64c56c22dd1061c5d"}
Apr 22 16:00:40.639894 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.639605 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.639894 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.639653 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-web-config\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.639894 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.639815 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.642286 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.641718 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.642286 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.641808 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-872r5\" (UniqueName: \"kubernetes.io/projected/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-kube-api-access-872r5\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.642286 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.641855 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.645779 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.643802 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.647743 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.646756 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.647743 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.646809 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:00:40.647743
ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.646847 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.647743 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.646877 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.647743 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.646914 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.647743 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.646959 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.647743 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.647005 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.647743 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.647034 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.647743 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.647108 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.647743 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.647142 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.647743 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.647179 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-config\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.647743 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.647201 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-config-out\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.658068 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.651706 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.658068 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.652036 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.658068 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.653207 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.658068 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.655516 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.658068 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.656558 2565 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.658068 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.656828 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.658068 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.657832 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.659478 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.659456 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-config-out\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.659738 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.659716 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.659823 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.659742 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-config\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.660626 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.660411 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.661176 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.661121 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.661542 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.661442 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.662658 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.662614 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-web-config\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.663209 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.663190 2565 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.663781 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.663767 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-872r5\" (UniqueName: \"kubernetes.io/projected/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-kube-api-access-872r5\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.665126 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.664881 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:40.724504 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:40.724466 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:41.060440 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:41.060418 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 16:00:41.465033 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:41.465000 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc5b8f4f_d7cb_4508_88a5_3c9de5ef2aff.slice/crio-f1dbdd7c0e638865ba0f831fb6216966bfbab93f225390074c434e02f9202381 WatchSource:0}: Error finding container f1dbdd7c0e638865ba0f831fb6216966bfbab93f225390074c434e02f9202381: Status 404 returned error can't find the container with id f1dbdd7c0e638865ba0f831fb6216966bfbab93f225390074c434e02f9202381 Apr 22 16:00:41.570945 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:41.570915 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" event={"ID":"00a88686-6ce6-4405-bd51-ea862bbae204","Type":"ContainerStarted","Data":"c3cfac6dfa846028c23594d9866fefd43f326874e429375646975620a29b73c4"} Apr 22 16:00:41.571129 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:41.571113 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:41.572216 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:41.572193 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerStarted","Data":"f1dbdd7c0e638865ba0f831fb6216966bfbab93f225390074c434e02f9202381"} Apr 22 16:00:41.573625 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:41.573605 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm" 
event={"ID":"64aca963-edf8-457f-990b-6fd0e03348b1","Type":"ContainerStarted","Data":"1dd0f42f537f11f8d096dce64b6c41645199f8d1df082a24e274b4e26475ff6b"} Apr 22 16:00:41.593017 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:41.592977 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" podStartSLOduration=2.060592944 podStartE2EDuration="4.592962729s" podCreationTimestamp="2026-04-22 16:00:37 +0000 UTC" firstStartedPulling="2026-04-22 16:00:37.614667712 +0000 UTC m=+122.049761386" lastFinishedPulling="2026-04-22 16:00:40.147037481 +0000 UTC m=+124.582131171" observedRunningTime="2026-04-22 16:00:41.590863614 +0000 UTC m=+126.025957311" watchObservedRunningTime="2026-04-22 16:00:41.592962729 +0000 UTC m=+126.028056472" Apr 22 16:00:41.606575 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:41.606530 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm" podStartSLOduration=1.984039855 podStartE2EDuration="3.606516762s" podCreationTimestamp="2026-04-22 16:00:38 +0000 UTC" firstStartedPulling="2026-04-22 16:00:39.332723594 +0000 UTC m=+123.767817268" lastFinishedPulling="2026-04-22 16:00:40.955200497 +0000 UTC m=+125.390294175" observedRunningTime="2026-04-22 16:00:41.605405256 +0000 UTC m=+126.040498954" watchObservedRunningTime="2026-04-22 16:00:41.606516762 +0000 UTC m=+126.041610440" Apr 22 16:00:42.580544 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:42.580510 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5997968445-m59rp" event={"ID":"3b837bd3-a08f-465c-a109-a195a45ffccb","Type":"ContainerStarted","Data":"787931d0d69dc15968e5b988dfd939cf810346322a36c83d18ba00b862635d64"} Apr 22 16:00:43.584534 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:43.584491 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-5997968445-m59rp" event={"ID":"3b837bd3-a08f-465c-a109-a195a45ffccb","Type":"ContainerStarted","Data":"69910e7de0e49b28a03d36aceccc04f740afbf04bb2814ee095e34b0e84d7107"} Apr 22 16:00:43.584966 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:43.584540 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5997968445-m59rp" event={"ID":"3b837bd3-a08f-465c-a109-a195a45ffccb","Type":"ContainerStarted","Data":"3e844a129076892b0736b7de03c4c8f9562190ba36143441b82161d41cce3adf"} Apr 22 16:00:43.585794 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:43.585765 2565 generic.go:358] "Generic (PLEG): container finished" podID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerID="cb8786ebf6798f793abbfc35e0985ac9031fe3c5bb4e992593ac38e317774820" exitCode=0 Apr 22 16:00:43.585897 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:43.585830 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerDied","Data":"cb8786ebf6798f793abbfc35e0985ac9031fe3c5bb4e992593ac38e317774820"} Apr 22 16:00:43.626951 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:43.626894 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5997968445-m59rp" podStartSLOduration=1.815233131 podStartE2EDuration="4.62687912s" podCreationTimestamp="2026-04-22 16:00:39 +0000 UTC" firstStartedPulling="2026-04-22 16:00:39.74355182 +0000 UTC m=+124.178645498" lastFinishedPulling="2026-04-22 16:00:42.555197808 +0000 UTC m=+126.990291487" observedRunningTime="2026-04-22 16:00:43.625299572 +0000 UTC m=+128.060393287" watchObservedRunningTime="2026-04-22 16:00:43.62687912 +0000 UTC m=+128.061972817" Apr 22 16:00:45.905414 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:45.905364 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 16:00:45.908038 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:45.908013 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2cd47a51-d8a9-48f4-bf8e-d11d89cead22-metrics-certs\") pod \"network-metrics-daemon-2nbv7\" (UID: \"2cd47a51-d8a9-48f4-bf8e-d11d89cead22\") " pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 16:00:46.049327 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:46.049293 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-52jwv\"" Apr 22 16:00:46.057160 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:46.057132 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2nbv7" Apr 22 16:00:46.207722 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:46.207696 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2nbv7"] Apr 22 16:00:46.210618 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:00:46.210590 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd47a51_d8a9_48f4_bf8e_d11d89cead22.slice/crio-85a42ffb26a106d5b49c53e0fc6266a687253cd2944c9ab98ebf80ea1fd274f0 WatchSource:0}: Error finding container 85a42ffb26a106d5b49c53e0fc6266a687253cd2944c9ab98ebf80ea1fd274f0: Status 404 returned error can't find the container with id 85a42ffb26a106d5b49c53e0fc6266a687253cd2944c9ab98ebf80ea1fd274f0 Apr 22 16:00:46.595279 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:46.595243 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2nbv7" event={"ID":"2cd47a51-d8a9-48f4-bf8e-d11d89cead22","Type":"ContainerStarted","Data":"85a42ffb26a106d5b49c53e0fc6266a687253cd2944c9ab98ebf80ea1fd274f0"} Apr 22 16:00:46.598060 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:46.598030 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerStarted","Data":"d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0"} Apr 22 16:00:46.598189 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:46.598067 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerStarted","Data":"02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1"} Apr 22 16:00:46.598189 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:46.598082 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerStarted","Data":"90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916"} Apr 22 16:00:46.598189 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:46.598113 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerStarted","Data":"e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb"} Apr 22 16:00:46.598189 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:46.598126 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerStarted","Data":"9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39"} Apr 22 16:00:46.598189 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:46.598139 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerStarted","Data":"f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54"} Apr 22 16:00:46.623605 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:46.623553 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.950255094 podStartE2EDuration="6.623534261s" podCreationTimestamp="2026-04-22 16:00:40 +0000 UTC" firstStartedPulling="2026-04-22 16:00:41.467037742 +0000 UTC m=+125.902131420" lastFinishedPulling="2026-04-22 16:00:46.140316909 +0000 UTC m=+130.575410587" observedRunningTime="2026-04-22 16:00:46.621822594 +0000 UTC m=+131.056916304" watchObservedRunningTime="2026-04-22 16:00:46.623534261 +0000 UTC m=+131.058627991" Apr 22 16:00:47.587065 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:47.586988 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/thanos-querier-695bbcdf7d-5tzxb" Apr 22 16:00:47.602813 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:47.602785 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2nbv7" event={"ID":"2cd47a51-d8a9-48f4-bf8e-d11d89cead22","Type":"ContainerStarted","Data":"b7e0fb2c1c0c9eabcc3bdce95aec99826fd185236095381977cb1eba9d5a2d08"} Apr 22 16:00:47.602813 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:47.602816 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2nbv7" event={"ID":"2cd47a51-d8a9-48f4-bf8e-d11d89cead22","Type":"ContainerStarted","Data":"b8ed7cc2d9395db7f4bf4ab6eba4383f426079ac46c4ae814ceb43c21e910c5a"} Apr 22 16:00:47.620802 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:47.620749 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2nbv7" podStartSLOduration=130.518650067 podStartE2EDuration="2m11.620731382s" podCreationTimestamp="2026-04-22 15:58:36 +0000 UTC" firstStartedPulling="2026-04-22 16:00:46.212521296 +0000 UTC m=+130.647614970" lastFinishedPulling="2026-04-22 16:00:47.314602597 +0000 UTC m=+131.749696285" observedRunningTime="2026-04-22 16:00:47.619444135 +0000 UTC m=+132.054537834" watchObservedRunningTime="2026-04-22 16:00:47.620731382 +0000 UTC m=+132.055825083" Apr 22 16:00:47.960407 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:47.960328 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-77995db544-fmqlq"] Apr 22 16:00:49.122123 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:49.122054 2565 scope.go:117] "RemoveContainer" containerID="77e97d8a044d8cc9b1d63eba35aedccbc356ae8c05c04976099dd50eaccb99a5" Apr 22 16:00:49.612399 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:49.612370 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/2.log" Apr 22 16:00:49.612564 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:49.612447 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" event={"ID":"72f4e084-b355-494f-955a-9d9d02e32cdb","Type":"ContainerStarted","Data":"8096fa5ac53b36610e8927caebf4558a2d80cd80fe593822db21d6ec32ad30b8"} Apr 22 16:00:49.612747 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:49.612718 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" Apr 22 16:00:49.617422 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:49.617403 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" Apr 22 16:00:49.630696 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:49.630641 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-nrtqr" podStartSLOduration=55.482070181 podStartE2EDuration="57.630626236s" podCreationTimestamp="2026-04-22 15:59:52 +0000 UTC" firstStartedPulling="2026-04-22 15:59:52.94998384 +0000 UTC m=+77.385077515" lastFinishedPulling="2026-04-22 15:59:55.098539892 +0000 UTC m=+79.533633570" observedRunningTime="2026-04-22 16:00:49.63031551 +0000 UTC m=+134.065409208" watchObservedRunningTime="2026-04-22 16:00:49.630626236 +0000 UTC m=+134.065719935" Apr 22 16:00:50.725057 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:50.725024 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:00:58.828890 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:58.828854 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:00:58.828890 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:00:58.828895 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:01:12.979738 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:12.979668 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-77995db544-fmqlq" podUID="1b7616a8-fa8e-46e3-ac90-b510f706491e" containerName="registry" containerID="cri-o://7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0" gracePeriod=30
Apr 22 16:01:13.226269 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.226247 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77995db544-fmqlq"
Apr 22 16:01:13.344487 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.344450 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b7616a8-fa8e-46e3-ac90-b510f706491e-installation-pull-secrets\") pod \"1b7616a8-fa8e-46e3-ac90-b510f706491e\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") "
Apr 22 16:01:13.344487 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.344490 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b7616a8-fa8e-46e3-ac90-b510f706491e-image-registry-private-configuration\") pod \"1b7616a8-fa8e-46e3-ac90-b510f706491e\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") "
Apr 22 16:01:13.344718 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.344535 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls\") pod \"1b7616a8-fa8e-46e3-ac90-b510f706491e\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") "
Apr 22 16:01:13.344718 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.344571 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b7616a8-fa8e-46e3-ac90-b510f706491e-ca-trust-extracted\") pod \"1b7616a8-fa8e-46e3-ac90-b510f706491e\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") "
Apr 22 16:01:13.344718 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.344619 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b7616a8-fa8e-46e3-ac90-b510f706491e-trusted-ca\") pod \"1b7616a8-fa8e-46e3-ac90-b510f706491e\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") "
Apr 22 16:01:13.344718 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.344657 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj7rr\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-kube-api-access-fj7rr\") pod \"1b7616a8-fa8e-46e3-ac90-b510f706491e\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") "
Apr 22 16:01:13.344718 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.344712 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-certificates\") pod \"1b7616a8-fa8e-46e3-ac90-b510f706491e\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") "
Apr 22 16:01:13.344965 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.344745 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-bound-sa-token\") pod \"1b7616a8-fa8e-46e3-ac90-b510f706491e\" (UID: \"1b7616a8-fa8e-46e3-ac90-b510f706491e\") "
Apr 22 16:01:13.345395 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.345175 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b7616a8-fa8e-46e3-ac90-b510f706491e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1b7616a8-fa8e-46e3-ac90-b510f706491e" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:01:13.345952 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.345903 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1b7616a8-fa8e-46e3-ac90-b510f706491e" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:01:13.347514 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.347490 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1b7616a8-fa8e-46e3-ac90-b510f706491e" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:01:13.347618 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.347594 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b7616a8-fa8e-46e3-ac90-b510f706491e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1b7616a8-fa8e-46e3-ac90-b510f706491e" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:01:13.347710 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.347689 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b7616a8-fa8e-46e3-ac90-b510f706491e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1b7616a8-fa8e-46e3-ac90-b510f706491e" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:01:13.347761 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.347726 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-kube-api-access-fj7rr" (OuterVolumeSpecName: "kube-api-access-fj7rr") pod "1b7616a8-fa8e-46e3-ac90-b510f706491e" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e"). InnerVolumeSpecName "kube-api-access-fj7rr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:01:13.347820 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.347799 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1b7616a8-fa8e-46e3-ac90-b510f706491e" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:01:13.357112 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.357066 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b7616a8-fa8e-46e3-ac90-b510f706491e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1b7616a8-fa8e-46e3-ac90-b510f706491e" (UID: "1b7616a8-fa8e-46e3-ac90-b510f706491e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:01:13.445608 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.445583 2565 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-tls\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:13.445608 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.445604 2565 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1b7616a8-fa8e-46e3-ac90-b510f706491e-ca-trust-extracted\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:13.445748 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.445615 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b7616a8-fa8e-46e3-ac90-b510f706491e-trusted-ca\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:13.445748 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.445624 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fj7rr\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-kube-api-access-fj7rr\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:13.445748 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.445633 2565 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1b7616a8-fa8e-46e3-ac90-b510f706491e-registry-certificates\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:13.445748 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.445642 2565 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b7616a8-fa8e-46e3-ac90-b510f706491e-bound-sa-token\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:13.445748 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.445651 2565 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1b7616a8-fa8e-46e3-ac90-b510f706491e-installation-pull-secrets\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:13.445748 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.445661 2565 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1b7616a8-fa8e-46e3-ac90-b510f706491e-image-registry-private-configuration\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:13.692718 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.692630 2565 generic.go:358] "Generic (PLEG): container finished" podID="1b7616a8-fa8e-46e3-ac90-b510f706491e" containerID="7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0" exitCode=0
Apr 22 16:01:13.692863 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.692726 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77995db544-fmqlq"
Apr 22 16:01:13.692863 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.692718 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77995db544-fmqlq" event={"ID":"1b7616a8-fa8e-46e3-ac90-b510f706491e","Type":"ContainerDied","Data":"7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0"}
Apr 22 16:01:13.692863 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.692841 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77995db544-fmqlq" event={"ID":"1b7616a8-fa8e-46e3-ac90-b510f706491e","Type":"ContainerDied","Data":"639b7b1687b16cfb66371b37983368a9a81c2f1ec21c8aa9840b5ce356aecc83"}
Apr 22 16:01:13.693002 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.692867 2565 scope.go:117] "RemoveContainer" containerID="7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0"
Apr 22 16:01:13.701129 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.701110 2565 scope.go:117] "RemoveContainer" containerID="7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0"
Apr 22 16:01:13.701371 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:01:13.701352 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0\": container with ID starting with 7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0 not found: ID does not exist" containerID="7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0"
Apr 22 16:01:13.701410 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.701378 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0"} err="failed to get container status \"7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0\": rpc error: code = NotFound desc = could not find container \"7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0\": container with ID starting with 7fd3d9439fe190e7bd73b6d2895ab544fdcacf6912587672dbe97575331579e0 not found: ID does not exist"
Apr 22 16:01:13.714123 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.714078 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-77995db544-fmqlq"]
Apr 22 16:01:13.718028 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:13.718007 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-77995db544-fmqlq"]
Apr 22 16:01:14.125953 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:14.125922 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b7616a8-fa8e-46e3-ac90-b510f706491e" path="/var/lib/kubelet/pods/1b7616a8-fa8e-46e3-ac90-b510f706491e/volumes"
Apr 22 16:01:18.834072 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:18.834038 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:01:18.837861 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:18.837837 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-74dfc75f77-4s8jm"
Apr 22 16:01:40.725527 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:40.725474 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:01:40.745936 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:40.745909 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:01:40.790439 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:40.790412 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:01:58.705717 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.705679 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 16:01:58.706403 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.706368 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="thanos-sidecar" containerID="cri-o://e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb" gracePeriod=600
Apr 22 16:01:58.706504 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.706432 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="kube-rbac-proxy-thanos" containerID="cri-o://d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0" gracePeriod=600
Apr 22 16:01:58.706574 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.706379 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="kube-rbac-proxy" containerID="cri-o://02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1" gracePeriod=600
Apr 22 16:01:58.706574 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.706384 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="config-reloader" containerID="cri-o://9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39" gracePeriod=600
Apr 22 16:01:58.706687 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.706403 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="kube-rbac-proxy-web" containerID="cri-o://90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916" gracePeriod=600
Apr 22 16:01:58.706687 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.706425 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="prometheus" containerID="cri-o://f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54" gracePeriod=600
Apr 22 16:01:58.831133 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.831102 2565 generic.go:358] "Generic (PLEG): container finished" podID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerID="d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0" exitCode=0
Apr 22 16:01:58.831222 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.831124 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerDied","Data":"d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0"}
Apr 22 16:01:58.831222 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.831172 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerDied","Data":"90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916"}
Apr 22 16:01:58.831222 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.831134 2565 generic.go:358] "Generic (PLEG): container finished" podID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerID="90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916" exitCode=0
Apr 22 16:01:58.831222 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.831201 2565 generic.go:358] "Generic (PLEG): container finished" podID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerID="e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb" exitCode=0
Apr 22 16:01:58.831222 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.831214 2565 generic.go:358] "Generic (PLEG): container finished" podID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerID="9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39" exitCode=0
Apr 22 16:01:58.831382 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.831225 2565 generic.go:358] "Generic (PLEG): container finished" podID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerID="f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54" exitCode=0
Apr 22 16:01:58.831382 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.831291 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerDied","Data":"e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb"}
Apr 22 16:01:58.831382 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.831316 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerDied","Data":"9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39"}
Apr 22 16:01:58.831382 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.831329 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerDied","Data":"f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54"}
Apr 22 16:01:58.959037 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:58.958972 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:01:59.012439 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012390 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-metrics-client-ca\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.012439 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012439 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.012692 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012488 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-k8s-db\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.012692 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012673 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-tls\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.012800 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012723 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-kube-rbac-proxy\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.012800 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012765 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-kubelet-serving-ca-bundle\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.012904 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012809 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-metrics-client-certs\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.012904 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012838 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-config\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.012904 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012838 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:01:59.012904 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012863 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-k8s-rulefiles-0\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.013113 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012902 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.013113 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012950 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-config-out\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.013113 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.012978 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-web-config\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.013113 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.013011 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-grpc-tls\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.013113 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.013036 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-trusted-ca-bundle\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.013113 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.013106 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-872r5\" (UniqueName: \"kubernetes.io/projected/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-kube-api-access-872r5\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.013396 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.013133 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-serving-certs-ca-bundle\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.013396 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.013161 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-tls-assets\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.013396 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.013187 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-thanos-prometheus-http-client-file\") pod \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\" (UID: \"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff\") "
Apr 22 16:01:59.013524 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.013438 2565 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-metrics-client-ca\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:59.013718 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.013696 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:01:59.014591 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.014557 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:01:59.015865 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.015810 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:01:59.015865 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.015901 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:01:59.016198 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.016020 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:01:59.016410 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.016385 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:01:59.016528 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.016502 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-config-out" (OuterVolumeSpecName: "config-out") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:01:59.017382 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.017353 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:01:59.017962 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.017933 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:01:59.018054 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.017973 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:01:59.018791 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.018761 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:01:59.019031 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.018983 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-config" (OuterVolumeSpecName: "config") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:01:59.019206 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.019132 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:01:59.019600 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.019573 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-kube-api-access-872r5" (OuterVolumeSpecName: "kube-api-access-872r5") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "kube-api-access-872r5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:01:59.020435 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.020406 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:01:59.020793 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.020770 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:01:59.032254 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.032231 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-web-config" (OuterVolumeSpecName: "web-config") pod "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" (UID: "bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:01:59.113820 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113788 2565 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-k8s-db\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:59.113820 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113816 2565 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-tls\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:59.113820 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113826 2565 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-kube-rbac-proxy\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113837 2565 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113847 2565 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-metrics-client-certs\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113859 2565 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-config\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113868 2565 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113879 2565 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113887 2565 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-config-out\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\""
Apr 22 16:01:59.114036 ip-10-0-135-152
kubenswrapper[2565]: I0422 16:01:59.113895 2565 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-web-config\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\"" Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113903 2565 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-grpc-tls\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\"" Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113910 2565 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-prometheus-trusted-ca-bundle\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\"" Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113919 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-872r5\" (UniqueName: \"kubernetes.io/projected/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-kube-api-access-872r5\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\"" Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113927 2565 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\"" Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113936 2565 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-tls-assets\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\"" Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113944 2565 
reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-thanos-prometheus-http-client-file\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\"" Apr 22 16:01:59.114036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.113952 2565 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-135-152.ec2.internal\" DevicePath \"\"" Apr 22 16:01:59.837122 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.837072 2565 generic.go:358] "Generic (PLEG): container finished" podID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerID="02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1" exitCode=0 Apr 22 16:01:59.837525 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.837139 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerDied","Data":"02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1"} Apr 22 16:01:59.837525 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.837166 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff","Type":"ContainerDied","Data":"f1dbdd7c0e638865ba0f831fb6216966bfbab93f225390074c434e02f9202381"} Apr 22 16:01:59.837525 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.837181 2565 scope.go:117] "RemoveContainer" containerID="d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0" Apr 22 16:01:59.837525 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.837219 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:01:59.845103 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.844888 2565 scope.go:117] "RemoveContainer" containerID="02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1" Apr 22 16:01:59.851503 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.851487 2565 scope.go:117] "RemoveContainer" containerID="90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916" Apr 22 16:01:59.857938 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.857921 2565 scope.go:117] "RemoveContainer" containerID="e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb" Apr 22 16:01:59.859708 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.859688 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 16:01:59.863954 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.863934 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 16:01:59.865207 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.865189 2565 scope.go:117] "RemoveContainer" containerID="9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39" Apr 22 16:01:59.871570 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.871551 2565 scope.go:117] "RemoveContainer" containerID="f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54" Apr 22 16:01:59.878123 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.878079 2565 scope.go:117] "RemoveContainer" containerID="cb8786ebf6798f793abbfc35e0985ac9031fe3c5bb4e992593ac38e317774820" Apr 22 16:01:59.885104 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885063 2565 scope.go:117] "RemoveContainer" containerID="d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0" Apr 22 16:01:59.885400 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885381 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] 
Apr 22 16:01:59.885482 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:01:59.885451 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0\": container with ID starting with d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0 not found: ID does not exist" containerID="d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0" Apr 22 16:01:59.885549 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885476 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0"} err="failed to get container status \"d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0\": rpc error: code = NotFound desc = could not find container \"d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0\": container with ID starting with d4ec76585db5afc9624bb3a58921fe5389e0cf1f087175c9f156d0fe2b08f4e0 not found: ID does not exist" Apr 22 16:01:59.885549 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885494 2565 scope.go:117] "RemoveContainer" containerID="02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1" Apr 22 16:01:59.885739 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885726 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="thanos-sidecar" Apr 22 16:01:59.885781 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885742 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="thanos-sidecar" Apr 22 16:01:59.885781 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885755 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="config-reloader" Apr 22 
16:01:59.885781 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885761 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="config-reloader" Apr 22 16:01:59.885781 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885768 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="init-config-reloader" Apr 22 16:01:59.885781 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885773 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="init-config-reloader" Apr 22 16:01:59.885781 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885780 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="kube-rbac-proxy" Apr 22 16:01:59.885945 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885786 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="kube-rbac-proxy" Apr 22 16:01:59.885945 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885794 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b7616a8-fa8e-46e3-ac90-b510f706491e" containerName="registry" Apr 22 16:01:59.885945 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885799 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7616a8-fa8e-46e3-ac90-b510f706491e" containerName="registry" Apr 22 16:01:59.885945 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:01:59.885729 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1\": container with ID starting with 02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1 not found: ID does not exist" 
containerID="02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1" Apr 22 16:01:59.885945 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885828 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1"} err="failed to get container status \"02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1\": rpc error: code = NotFound desc = could not find container \"02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1\": container with ID starting with 02cf79fd86e33c4212774e53730287550265709f6b532df9aeb8e3045dd542b1 not found: ID does not exist" Apr 22 16:01:59.885945 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885846 2565 scope.go:117] "RemoveContainer" containerID="90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916" Apr 22 16:01:59.885945 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885810 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="kube-rbac-proxy-web" Apr 22 16:01:59.885945 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885887 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="kube-rbac-proxy-web" Apr 22 16:01:59.885945 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885906 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="kube-rbac-proxy-thanos" Apr 22 16:01:59.885945 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885915 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="kube-rbac-proxy-thanos" Apr 22 16:01:59.885945 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885928 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="prometheus" Apr 22 16:01:59.885945 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.885936 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="prometheus" Apr 22 16:01:59.886367 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886025 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="config-reloader" Apr 22 16:01:59.886367 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886037 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="kube-rbac-proxy-web" Apr 22 16:01:59.886367 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886046 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="kube-rbac-proxy" Apr 22 16:01:59.886367 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886053 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="thanos-sidecar" Apr 22 16:01:59.886367 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886060 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="prometheus" Apr 22 16:01:59.886367 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886066 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" containerName="kube-rbac-proxy-thanos" Apr 22 16:01:59.886367 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886074 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b7616a8-fa8e-46e3-ac90-b510f706491e" containerName="registry" Apr 22 16:01:59.886367 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:01:59.886076 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916\": container with ID starting with 90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916 not found: ID does not exist" containerID="90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916" Apr 22 16:01:59.886367 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886124 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916"} err="failed to get container status \"90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916\": rpc error: code = NotFound desc = could not find container \"90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916\": container with ID starting with 90bdd94185f6bc23c3319e2f1a80116c5b694658170515def90a5813a0751916 not found: ID does not exist" Apr 22 16:01:59.886367 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886141 2565 scope.go:117] "RemoveContainer" containerID="e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb" Apr 22 16:01:59.886708 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:01:59.886378 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb\": container with ID starting with e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb not found: ID does not exist" containerID="e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb" Apr 22 16:01:59.886708 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886394 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb"} err="failed to get container status \"e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb\": rpc error: code = NotFound desc 
= could not find container \"e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb\": container with ID starting with e0b4fff3b09b755de202748101e3205b55ab9f0ac94ab6ba2130b5e7e965efbb not found: ID does not exist" Apr 22 16:01:59.886708 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886408 2565 scope.go:117] "RemoveContainer" containerID="9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39" Apr 22 16:01:59.886708 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:01:59.886605 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39\": container with ID starting with 9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39 not found: ID does not exist" containerID="9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39" Apr 22 16:01:59.886708 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886625 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39"} err="failed to get container status \"9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39\": rpc error: code = NotFound desc = could not find container \"9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39\": container with ID starting with 9035f1bf94f096c0f034d97b18528e94e29528a1545997299097e396510bfd39 not found: ID does not exist" Apr 22 16:01:59.886708 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886640 2565 scope.go:117] "RemoveContainer" containerID="f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54" Apr 22 16:01:59.887035 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:01:59.886850 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54\": 
container with ID starting with f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54 not found: ID does not exist" containerID="f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54" Apr 22 16:01:59.887035 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886864 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54"} err="failed to get container status \"f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54\": rpc error: code = NotFound desc = could not find container \"f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54\": container with ID starting with f27c002baf801cb19033cb2ff62ba89f8b88eef0ca997a7c26bd2c22a20e8e54 not found: ID does not exist" Apr 22 16:01:59.887035 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.886878 2565 scope.go:117] "RemoveContainer" containerID="cb8786ebf6798f793abbfc35e0985ac9031fe3c5bb4e992593ac38e317774820" Apr 22 16:01:59.887239 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:01:59.887142 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb8786ebf6798f793abbfc35e0985ac9031fe3c5bb4e992593ac38e317774820\": container with ID starting with cb8786ebf6798f793abbfc35e0985ac9031fe3c5bb4e992593ac38e317774820 not found: ID does not exist" containerID="cb8786ebf6798f793abbfc35e0985ac9031fe3c5bb4e992593ac38e317774820" Apr 22 16:01:59.887239 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.887162 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8786ebf6798f793abbfc35e0985ac9031fe3c5bb4e992593ac38e317774820"} err="failed to get container status \"cb8786ebf6798f793abbfc35e0985ac9031fe3c5bb4e992593ac38e317774820\": rpc error: code = NotFound desc = could not find container \"cb8786ebf6798f793abbfc35e0985ac9031fe3c5bb4e992593ac38e317774820\": container with 
ID starting with cb8786ebf6798f793abbfc35e0985ac9031fe3c5bb4e992593ac38e317774820 not found: ID does not exist" Apr 22 16:01:59.891847 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.891832 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:01:59.894109 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.894075 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 16:01:59.894225 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.894162 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 16:01:59.894225 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.894173 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 16:01:59.894225 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.894214 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 16:01:59.894437 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.894213 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 16:01:59.894437 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.894162 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 16:01:59.894627 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.894608 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 16:01:59.894778 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.894725 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 16:01:59.894778 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.894740 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1di3j4jh2ntqg\"" Apr 22 16:01:59.894986 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.894956 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 16:01:59.895081 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.895018 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 16:01:59.895081 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.895034 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qdbw9\"" Apr 22 16:01:59.896903 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.896886 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 16:01:59.900281 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.900246 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 16:01:59.902200 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.902178 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 16:01:59.920395 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920371 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:01:59.920518 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920401 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:01:59.920518 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920421 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8629a6c0-975f-44bb-946e-c5676e4bf5ed-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:01:59.920518 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920439 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:01:59.920682 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920526 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:01:59.920682 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920584 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:01:59.920682 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920627 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:01:59.920831 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920678 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8629a6c0-975f-44bb-946e-c5676e4bf5ed-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:01:59.920831 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920708 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:01:59.920831 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920737 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " 
pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:01:59.920831 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920770 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-web-config\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:01:59.920831 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920797 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8kxh\" (UniqueName: \"kubernetes.io/projected/8629a6c0-975f-44bb-946e-c5676e4bf5ed-kube-api-access-w8kxh\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:01:59.921067 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920837 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:01:59.921067 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920877 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-config\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:01:59.921067 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920900 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:01:59.921067 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920936 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:01:59.921067 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920973 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:01:59.921067 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:01:59.920997 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8629a6c0-975f-44bb-946e-c5676e4bf5ed-config-out\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.021707 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.021672 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.021858 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.021731 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.021858 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.021785 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.021858 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.021819 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8629a6c0-975f-44bb-946e-c5676e4bf5ed-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.021858 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.021845 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.022077 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.021878 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.022077 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.021904 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-web-config\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.022077 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.021927 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8kxh\" (UniqueName: \"kubernetes.io/projected/8629a6c0-975f-44bb-946e-c5676e4bf5ed-kube-api-access-w8kxh\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.022077 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.021956 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.022077 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.022048 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-config\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.022077 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.022076 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.022392 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.022137 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.022392 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.022183 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.022392 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.022212 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8629a6c0-975f-44bb-946e-c5676e4bf5ed-config-out\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.022392 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.022242 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.022392 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.022270 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.022711 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.022682 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.023143 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.022799 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.023254 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.023227 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8629a6c0-975f-44bb-946e-c5676e4bf5ed-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.023323 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.023273 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.024202 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.023908 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8629a6c0-975f-44bb-946e-c5676e4bf5ed-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.024202 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.024055 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.024772 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.024710 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.025210 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.025158 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.025313 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.025271 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.025691 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.025641 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8629a6c0-975f-44bb-946e-c5676e4bf5ed-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.026168 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.026135 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-config\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.027006 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.026981 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.027350 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.027309 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.027441 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.027387 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-web-config\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.027718 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.027701 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.027794 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.027775 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8629a6c0-975f-44bb-946e-c5676e4bf5ed-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.027985 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.027964 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8629a6c0-975f-44bb-946e-c5676e4bf5ed-config-out\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.028681 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.028663 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.029149 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.029130 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8629a6c0-975f-44bb-946e-c5676e4bf5ed-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.030331 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.030311 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8kxh\" (UniqueName: \"kubernetes.io/projected/8629a6c0-975f-44bb-946e-c5676e4bf5ed-kube-api-access-w8kxh\") pod \"prometheus-k8s-0\" (UID: \"8629a6c0-975f-44bb-946e-c5676e4bf5ed\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.124856 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.124811 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff" path="/var/lib/kubelet/pods/bc5b8f4f-d7cb-4508-88a5-3c9de5ef2aff/volumes"
Apr 22 16:02:00.202647 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.202613 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:02:00.327118 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.327071 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 16:02:00.328902 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:02:00.328876 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8629a6c0_975f_44bb_946e_c5676e4bf5ed.slice/crio-0487828de21d53f44907c05b7f745005dc5d790bc0cc3cc6c4447c54dcfcdedc WatchSource:0}: Error finding container 0487828de21d53f44907c05b7f745005dc5d790bc0cc3cc6c4447c54dcfcdedc: Status 404 returned error can't find the container with id 0487828de21d53f44907c05b7f745005dc5d790bc0cc3cc6c4447c54dcfcdedc
Apr 22 16:02:00.842446 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.842407 2565 generic.go:358] "Generic (PLEG): container finished" podID="8629a6c0-975f-44bb-946e-c5676e4bf5ed" containerID="1eed8d3a56d026e61b7bcc390a66464e4dce0635479363175f204b321751045e" exitCode=0
Apr 22 16:02:00.842922 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.842485 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8629a6c0-975f-44bb-946e-c5676e4bf5ed","Type":"ContainerDied","Data":"1eed8d3a56d026e61b7bcc390a66464e4dce0635479363175f204b321751045e"}
Apr 22 16:02:00.842922 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:00.842509 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8629a6c0-975f-44bb-946e-c5676e4bf5ed","Type":"ContainerStarted","Data":"0487828de21d53f44907c05b7f745005dc5d790bc0cc3cc6c4447c54dcfcdedc"}
Apr 22 16:02:01.847514 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:01.847468 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8629a6c0-975f-44bb-946e-c5676e4bf5ed","Type":"ContainerStarted","Data":"2d8b621a0b2136bb7b9c00d07932cc1d1e0a2e04c61da38896eef438687dc047"}
Apr 22 16:02:01.847514 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:01.847516 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8629a6c0-975f-44bb-946e-c5676e4bf5ed","Type":"ContainerStarted","Data":"686aafc8857fc76856efcc394ee21c304adfd92b6b24b678b1f8b4062de7e815"}
Apr 22 16:02:01.847907 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:01.847525 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8629a6c0-975f-44bb-946e-c5676e4bf5ed","Type":"ContainerStarted","Data":"b24f09d26d58f99bfac227fb89937edf2012a6aaa1ebd4059e3ef09691571391"}
Apr 22 16:02:01.847907 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:01.847534 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8629a6c0-975f-44bb-946e-c5676e4bf5ed","Type":"ContainerStarted","Data":"bdd82cad7ea6c99a2bd53b9a7e35c558659f3d1c101033bf7db3bb94d1b31663"}
Apr 22 16:02:01.847907 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:01.847543 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8629a6c0-975f-44bb-946e-c5676e4bf5ed","Type":"ContainerStarted","Data":"ffc568c764e371b0711c020f3c43607590403c45f4058eb58ee97771c16f3367"}
Apr 22 16:02:01.847907 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:01.847551 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8629a6c0-975f-44bb-946e-c5676e4bf5ed","Type":"ContainerStarted","Data":"01d2e7dac21876265e2e4db0e808cf6f70b7df9f08394956ffc5dba4fed9ed1a"}
Apr 22 16:02:01.873030 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:01.872973 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.87295894 podStartE2EDuration="2.87295894s" podCreationTimestamp="2026-04-22 16:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:02:01.870973452 +0000 UTC m=+206.306067149" watchObservedRunningTime="2026-04-22 16:02:01.87295894 +0000 UTC m=+206.308052637"
Apr 22 16:02:05.202977 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:02:05.202937 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:03:00.202976 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:03:00.202899 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:03:00.218637 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:03:00.218614 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:03:01.027669 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:03:01.027641 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:03:36.008140 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:03:36.008106 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/2.log"
Apr 22 16:03:36.008963 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:03:36.008943 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/2.log"
Apr 22 16:03:36.010651 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:03:36.010632 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-acl-logging/0.log"
Apr 22 16:03:36.011550 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:03:36.011530 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-acl-logging/0.log"
Apr 22 16:03:36.017780 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:03:36.017760 2565 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 16:05:13.803714 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:13.803672 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-8ghjt"]
Apr 22 16:05:13.806788 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:13.806771 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-8ghjt"
Apr 22 16:05:13.809677 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:13.809647 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-2gmf5\""
Apr 22 16:05:13.810082 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:13.810064 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 16:05:13.810506 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:13.810488 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 16:05:13.830808 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:13.830779 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-8ghjt"]
Apr 22 16:05:13.920979 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:13.920947 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcskr\" (UniqueName: \"kubernetes.io/projected/0f8bcffd-b015-4d59-b00c-38685953a2e5-kube-api-access-mcskr\") pod \"cert-manager-79c8d999ff-8ghjt\" (UID: \"0f8bcffd-b015-4d59-b00c-38685953a2e5\") " pod="cert-manager/cert-manager-79c8d999ff-8ghjt"
Apr 22 16:05:13.921178 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:13.921003 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f8bcffd-b015-4d59-b00c-38685953a2e5-bound-sa-token\") pod \"cert-manager-79c8d999ff-8ghjt\" (UID: \"0f8bcffd-b015-4d59-b00c-38685953a2e5\") " pod="cert-manager/cert-manager-79c8d999ff-8ghjt"
Apr 22 16:05:14.021491 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:14.021453 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcskr\" (UniqueName: \"kubernetes.io/projected/0f8bcffd-b015-4d59-b00c-38685953a2e5-kube-api-access-mcskr\") pod \"cert-manager-79c8d999ff-8ghjt\" (UID: \"0f8bcffd-b015-4d59-b00c-38685953a2e5\") " pod="cert-manager/cert-manager-79c8d999ff-8ghjt"
Apr 22 16:05:14.021668 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:14.021511 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f8bcffd-b015-4d59-b00c-38685953a2e5-bound-sa-token\") pod \"cert-manager-79c8d999ff-8ghjt\" (UID: \"0f8bcffd-b015-4d59-b00c-38685953a2e5\") " pod="cert-manager/cert-manager-79c8d999ff-8ghjt"
Apr 22 16:05:14.029040 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:14.029011 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f8bcffd-b015-4d59-b00c-38685953a2e5-bound-sa-token\") pod \"cert-manager-79c8d999ff-8ghjt\" (UID: \"0f8bcffd-b015-4d59-b00c-38685953a2e5\") " pod="cert-manager/cert-manager-79c8d999ff-8ghjt"
Apr 22 16:05:14.029205 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:14.029125 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcskr\" (UniqueName: \"kubernetes.io/projected/0f8bcffd-b015-4d59-b00c-38685953a2e5-kube-api-access-mcskr\") pod \"cert-manager-79c8d999ff-8ghjt\" (UID: \"0f8bcffd-b015-4d59-b00c-38685953a2e5\") " pod="cert-manager/cert-manager-79c8d999ff-8ghjt"
Apr 22 16:05:14.116938 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:14.116832 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-8ghjt"
Apr 22 16:05:14.241506 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:14.241479 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-8ghjt"]
Apr 22 16:05:14.243991 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:05:14.243954 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f8bcffd_b015_4d59_b00c_38685953a2e5.slice/crio-cd1cd2bb6b36c30cd9d027a2aace48b43f1e92bddd732ee06b0e2ff1566bf8c1 WatchSource:0}: Error finding container cd1cd2bb6b36c30cd9d027a2aace48b43f1e92bddd732ee06b0e2ff1566bf8c1: Status 404 returned error can't find the container with id cd1cd2bb6b36c30cd9d027a2aace48b43f1e92bddd732ee06b0e2ff1566bf8c1
Apr 22 16:05:14.245686 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:14.245668 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 16:05:14.389463 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:14.389370 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-8ghjt" event={"ID":"0f8bcffd-b015-4d59-b00c-38685953a2e5","Type":"ContainerStarted","Data":"cd1cd2bb6b36c30cd9d027a2aace48b43f1e92bddd732ee06b0e2ff1566bf8c1"}
Apr 22 16:05:17.399878 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:17.399843 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-8ghjt" event={"ID":"0f8bcffd-b015-4d59-b00c-38685953a2e5","Type":"ContainerStarted","Data":"4309e475b41cd44b821148239c3f609d43208e28b0f1b9c91d926cf30e008a41"}
Apr 22 16:05:17.417468 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:17.417416 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-8ghjt" podStartSLOduration=1.629678738 podStartE2EDuration="4.417401525s" podCreationTimestamp="2026-04-22 16:05:13 +0000 UTC" firstStartedPulling="2026-04-22 16:05:14.245802681 +0000 UTC m=+398.680896357" lastFinishedPulling="2026-04-22 16:05:17.033525453 +0000 UTC m=+401.468619144" observedRunningTime="2026-04-22 16:05:17.416264027 +0000 UTC m=+401.851357723" watchObservedRunningTime="2026-04-22 16:05:17.417401525 +0000 UTC m=+401.852495224"
Apr 22 16:05:45.786042 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.786005 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"]
Apr 22 16:05:45.790113 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.790078 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:45.792481 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.792456 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 22 16:05:45.792481 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.792477 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 22 16:05:45.792641 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.792477 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 22 16:05:45.792641 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.792477 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 22 16:05:45.793468 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.793441 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-2mt2p\""
Apr 22 16:05:45.793507 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.793491 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 22 16:05:45.797583 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.797564 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"]
Apr 22 16:05:45.891234 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.891198 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/81c02492-94cc-4b38-9784-c30031e4d287-metrics-cert\") pod \"lws-controller-manager-7d868c4d86-r8z45\" (UID: \"81c02492-94cc-4b38-9784-c30031e4d287\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:45.891425 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.891253 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfclg\" (UniqueName: \"kubernetes.io/projected/81c02492-94cc-4b38-9784-c30031e4d287-kube-api-access-bfclg\") pod \"lws-controller-manager-7d868c4d86-r8z45\" (UID: \"81c02492-94cc-4b38-9784-c30031e4d287\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:45.891425 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.891348 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81c02492-94cc-4b38-9784-c30031e4d287-cert\") pod \"lws-controller-manager-7d868c4d86-r8z45\" (UID: \"81c02492-94cc-4b38-9784-c30031e4d287\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:45.891425 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.891387 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/81c02492-94cc-4b38-9784-c30031e4d287-manager-config\") pod \"lws-controller-manager-7d868c4d86-r8z45\" (UID: \"81c02492-94cc-4b38-9784-c30031e4d287\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:45.992033 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.991996 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/81c02492-94cc-4b38-9784-c30031e4d287-metrics-cert\") pod \"lws-controller-manager-7d868c4d86-r8z45\" (UID: \"81c02492-94cc-4b38-9784-c30031e4d287\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:45.992214 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.992059 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfclg\" (UniqueName: \"kubernetes.io/projected/81c02492-94cc-4b38-9784-c30031e4d287-kube-api-access-bfclg\") pod \"lws-controller-manager-7d868c4d86-r8z45\" (UID: \"81c02492-94cc-4b38-9784-c30031e4d287\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:45.992214 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.992119 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81c02492-94cc-4b38-9784-c30031e4d287-cert\") pod \"lws-controller-manager-7d868c4d86-r8z45\" (UID: \"81c02492-94cc-4b38-9784-c30031e4d287\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:45.992214 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.992144 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/81c02492-94cc-4b38-9784-c30031e4d287-manager-config\") pod \"lws-controller-manager-7d868c4d86-r8z45\" (UID: \"81c02492-94cc-4b38-9784-c30031e4d287\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:45.992928 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.992846 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/81c02492-94cc-4b38-9784-c30031e4d287-manager-config\") pod \"lws-controller-manager-7d868c4d86-r8z45\" (UID: \"81c02492-94cc-4b38-9784-c30031e4d287\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:45.994763 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.994741 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/81c02492-94cc-4b38-9784-c30031e4d287-metrics-cert\") pod \"lws-controller-manager-7d868c4d86-r8z45\" (UID: \"81c02492-94cc-4b38-9784-c30031e4d287\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:45.994763 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:45.994759 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81c02492-94cc-4b38-9784-c30031e4d287-cert\") pod \"lws-controller-manager-7d868c4d86-r8z45\" (UID: \"81c02492-94cc-4b38-9784-c30031e4d287\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:46.001826 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:46.001770 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfclg\" (UniqueName: \"kubernetes.io/projected/81c02492-94cc-4b38-9784-c30031e4d287-kube-api-access-bfclg\") pod \"lws-controller-manager-7d868c4d86-r8z45\" (UID: \"81c02492-94cc-4b38-9784-c30031e4d287\") " pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:46.099570 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:46.099481 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:46.220364 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:46.220199 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"]
Apr 22 16:05:46.223244 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:05:46.223201 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c02492_94cc_4b38_9784_c30031e4d287.slice/crio-259d6c73d700c4f430f4f17773d0a35d0142e6ca7ca01a5e4c78c3220d14784d WatchSource:0}: Error finding container 259d6c73d700c4f430f4f17773d0a35d0142e6ca7ca01a5e4c78c3220d14784d: Status 404 returned error can't find the container with id 259d6c73d700c4f430f4f17773d0a35d0142e6ca7ca01a5e4c78c3220d14784d
Apr 22 16:05:46.490818 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:46.490737 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45" event={"ID":"81c02492-94cc-4b38-9784-c30031e4d287","Type":"ContainerStarted","Data":"259d6c73d700c4f430f4f17773d0a35d0142e6ca7ca01a5e4c78c3220d14784d"}
Apr 22 16:05:50.503455 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:50.503409 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45" event={"ID":"81c02492-94cc-4b38-9784-c30031e4d287","Type":"ContainerStarted","Data":"bcd9186b18a178011e5fd9518c65a5513bb09588d0d5e5459aeba134976d5c2d"}
Apr 22 16:05:50.503850 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:50.503514 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45"
Apr 22 16:05:50.518124 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:50.518055 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45" podStartSLOduration=2.269106215 podStartE2EDuration="5.518040007s" podCreationTimestamp="2026-04-22 16:05:45 +0000 UTC" firstStartedPulling="2026-04-22 16:05:46.224907976 +0000 UTC m=+430.660001651" lastFinishedPulling="2026-04-22 16:05:49.473841765 +0000 UTC m=+433.908935443" observedRunningTime="2026-04-22 16:05:50.517456935 +0000 UTC m=+434.952550632" watchObservedRunningTime="2026-04-22 16:05:50.518040007 +0000 UTC m=+434.953133704" Apr 22 16:05:53.005600 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.005511 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr"] Apr 22 16:05:53.007788 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.007771 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:05:53.010307 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.010283 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 16:05:53.010307 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.010300 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pj4nw\"" Apr 22 16:05:53.010487 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.010285 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 16:05:53.010487 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.010285 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 16:05:53.010487 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.010294 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 16:05:53.021500 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.021477 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr"] Apr 22 16:05:53.051356 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.051329 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8196a26a-5eed-4067-9591-67935be9b123-webhook-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-sjklr\" (UID: \"8196a26a-5eed-4067-9591-67935be9b123\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:05:53.051492 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.051386 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48qlf\" (UniqueName: \"kubernetes.io/projected/8196a26a-5eed-4067-9591-67935be9b123-kube-api-access-48qlf\") pod \"opendatahub-operator-controller-manager-54dfb4598d-sjklr\" (UID: \"8196a26a-5eed-4067-9591-67935be9b123\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:05:53.051492 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.051410 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8196a26a-5eed-4067-9591-67935be9b123-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-sjklr\" (UID: \"8196a26a-5eed-4067-9591-67935be9b123\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:05:53.152186 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.152151 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/8196a26a-5eed-4067-9591-67935be9b123-webhook-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-sjklr\" (UID: \"8196a26a-5eed-4067-9591-67935be9b123\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:05:53.152345 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.152199 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48qlf\" (UniqueName: \"kubernetes.io/projected/8196a26a-5eed-4067-9591-67935be9b123-kube-api-access-48qlf\") pod \"opendatahub-operator-controller-manager-54dfb4598d-sjklr\" (UID: \"8196a26a-5eed-4067-9591-67935be9b123\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:05:53.152345 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.152221 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8196a26a-5eed-4067-9591-67935be9b123-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-sjklr\" (UID: \"8196a26a-5eed-4067-9591-67935be9b123\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:05:53.154746 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.154721 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8196a26a-5eed-4067-9591-67935be9b123-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-sjklr\" (UID: \"8196a26a-5eed-4067-9591-67935be9b123\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:05:53.154870 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.154791 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8196a26a-5eed-4067-9591-67935be9b123-webhook-cert\") pod \"opendatahub-operator-controller-manager-54dfb4598d-sjklr\" (UID: 
\"8196a26a-5eed-4067-9591-67935be9b123\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:05:53.159833 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.159804 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48qlf\" (UniqueName: \"kubernetes.io/projected/8196a26a-5eed-4067-9591-67935be9b123-kube-api-access-48qlf\") pod \"opendatahub-operator-controller-manager-54dfb4598d-sjklr\" (UID: \"8196a26a-5eed-4067-9591-67935be9b123\") " pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:05:53.318233 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.318204 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:05:53.451882 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.451856 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr"] Apr 22 16:05:53.454768 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:05:53.454737 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8196a26a_5eed_4067_9591_67935be9b123.slice/crio-2c4aebf5bcb1264c33587609f5b3d29cbfb160ed27ff11635b523fddb6de16df WatchSource:0}: Error finding container 2c4aebf5bcb1264c33587609f5b3d29cbfb160ed27ff11635b523fddb6de16df: Status 404 returned error can't find the container with id 2c4aebf5bcb1264c33587609f5b3d29cbfb160ed27ff11635b523fddb6de16df Apr 22 16:05:53.513767 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:53.513722 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" event={"ID":"8196a26a-5eed-4067-9591-67935be9b123","Type":"ContainerStarted","Data":"2c4aebf5bcb1264c33587609f5b3d29cbfb160ed27ff11635b523fddb6de16df"} Apr 22 16:05:56.526069 ip-10-0-135-152 
kubenswrapper[2565]: I0422 16:05:56.526037 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" event={"ID":"8196a26a-5eed-4067-9591-67935be9b123","Type":"ContainerStarted","Data":"9a54e5ad5de7053f53cf495e67921d77f9964fa8299e429ef5d860b6a7915c19"} Apr 22 16:05:56.526490 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:56.526230 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:05:56.545242 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:05:56.545188 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" podStartSLOduration=1.98930026 podStartE2EDuration="4.545167789s" podCreationTimestamp="2026-04-22 16:05:52 +0000 UTC" firstStartedPulling="2026-04-22 16:05:53.456404897 +0000 UTC m=+437.891498573" lastFinishedPulling="2026-04-22 16:05:56.012272411 +0000 UTC m=+440.447366102" observedRunningTime="2026-04-22 16:05:56.543387724 +0000 UTC m=+440.978481421" watchObservedRunningTime="2026-04-22 16:05:56.545167789 +0000 UTC m=+440.980261487" Apr 22 16:06:01.509001 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:01.508959 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7d868c4d86-r8z45" Apr 22 16:06:07.532653 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:07.532622 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-54dfb4598d-sjklr" Apr 22 16:06:12.106146 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.106106 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l"] Apr 22 16:06:12.113486 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.113460 2565 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" Apr 22 16:06:12.118525 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.118494 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 22 16:06:12.120548 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.120215 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-kdnln\"" Apr 22 16:06:12.120548 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.120178 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 22 16:06:12.120749 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.120656 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 16:06:12.122040 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.121951 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 16:06:12.128912 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.128891 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l"] Apr 22 16:06:12.225062 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.225029 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56e03439-958f-4f76-83f9-7b88674b8eb3-tls-certs\") pod \"kube-auth-proxy-6c4b9b554-vd75l\" (UID: \"56e03439-958f-4f76-83f9-7b88674b8eb3\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" Apr 22 16:06:12.225062 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.225065 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/56e03439-958f-4f76-83f9-7b88674b8eb3-tmp\") pod \"kube-auth-proxy-6c4b9b554-vd75l\" (UID: \"56e03439-958f-4f76-83f9-7b88674b8eb3\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" Apr 22 16:06:12.225309 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.225104 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rldx\" (UniqueName: \"kubernetes.io/projected/56e03439-958f-4f76-83f9-7b88674b8eb3-kube-api-access-2rldx\") pod \"kube-auth-proxy-6c4b9b554-vd75l\" (UID: \"56e03439-958f-4f76-83f9-7b88674b8eb3\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" Apr 22 16:06:12.326553 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.326516 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56e03439-958f-4f76-83f9-7b88674b8eb3-tls-certs\") pod \"kube-auth-proxy-6c4b9b554-vd75l\" (UID: \"56e03439-958f-4f76-83f9-7b88674b8eb3\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" Apr 22 16:06:12.326717 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.326557 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56e03439-958f-4f76-83f9-7b88674b8eb3-tmp\") pod \"kube-auth-proxy-6c4b9b554-vd75l\" (UID: \"56e03439-958f-4f76-83f9-7b88674b8eb3\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" Apr 22 16:06:12.326717 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.326586 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rldx\" (UniqueName: \"kubernetes.io/projected/56e03439-958f-4f76-83f9-7b88674b8eb3-kube-api-access-2rldx\") pod \"kube-auth-proxy-6c4b9b554-vd75l\" (UID: \"56e03439-958f-4f76-83f9-7b88674b8eb3\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" Apr 22 16:06:12.329102 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.329051 
2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56e03439-958f-4f76-83f9-7b88674b8eb3-tmp\") pod \"kube-auth-proxy-6c4b9b554-vd75l\" (UID: \"56e03439-958f-4f76-83f9-7b88674b8eb3\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" Apr 22 16:06:12.329303 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.329287 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56e03439-958f-4f76-83f9-7b88674b8eb3-tls-certs\") pod \"kube-auth-proxy-6c4b9b554-vd75l\" (UID: \"56e03439-958f-4f76-83f9-7b88674b8eb3\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" Apr 22 16:06:12.333873 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.333852 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rldx\" (UniqueName: \"kubernetes.io/projected/56e03439-958f-4f76-83f9-7b88674b8eb3-kube-api-access-2rldx\") pod \"kube-auth-proxy-6c4b9b554-vd75l\" (UID: \"56e03439-958f-4f76-83f9-7b88674b8eb3\") " pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" Apr 22 16:06:12.432194 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.432109 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" Apr 22 16:06:12.563409 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.563382 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l"] Apr 22 16:06:12.566374 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:06:12.566344 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56e03439_958f_4f76_83f9_7b88674b8eb3.slice/crio-9645df405c49706a0c6689eecb6d983351aed57d4b123465f2babf60a39e36db WatchSource:0}: Error finding container 9645df405c49706a0c6689eecb6d983351aed57d4b123465f2babf60a39e36db: Status 404 returned error can't find the container with id 9645df405c49706a0c6689eecb6d983351aed57d4b123465f2babf60a39e36db Apr 22 16:06:12.580010 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:12.579983 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" event={"ID":"56e03439-958f-4f76-83f9-7b88674b8eb3","Type":"ContainerStarted","Data":"9645df405c49706a0c6689eecb6d983351aed57d4b123465f2babf60a39e36db"} Apr 22 16:06:16.595990 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:16.595893 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" event={"ID":"56e03439-958f-4f76-83f9-7b88674b8eb3","Type":"ContainerStarted","Data":"8350781be75e35468a24ee8d4d503e045f5a042d341f9c29d65a5c5772463a09"} Apr 22 16:06:16.611076 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:06:16.611011 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6c4b9b554-vd75l" podStartSLOduration=0.916930469 podStartE2EDuration="4.610993536s" podCreationTimestamp="2026-04-22 16:06:12 +0000 UTC" firstStartedPulling="2026-04-22 16:06:12.568251268 +0000 UTC m=+457.003344947" lastFinishedPulling="2026-04-22 16:06:16.262314334 +0000 UTC 
m=+460.697408014" observedRunningTime="2026-04-22 16:06:16.609828719 +0000 UTC m=+461.044922419" watchObservedRunningTime="2026-04-22 16:06:16.610993536 +0000 UTC m=+461.046087235" Apr 22 16:07:54.116371 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.116290 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25"] Apr 22 16:07:54.118509 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.118489 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" Apr 22 16:07:54.121516 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.121488 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 22 16:07:54.121516 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.121504 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 16:07:54.121516 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.121511 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-c25fv\"" Apr 22 16:07:54.122374 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.122357 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 16:07:54.122446 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.122357 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 22 16:07:54.128404 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.128383 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25"] Apr 22 16:07:54.218850 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.218814 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrjlf\" (UniqueName: \"kubernetes.io/projected/8fb9000a-7c2c-4782-baa4-b61b4cde118a-kube-api-access-mrjlf\") pod \"kuadrant-console-plugin-6cb54b5c86-mnh25\" (UID: \"8fb9000a-7c2c-4782-baa4-b61b4cde118a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" Apr 22 16:07:54.218850 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.218853 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb9000a-7c2c-4782-baa4-b61b4cde118a-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-mnh25\" (UID: \"8fb9000a-7c2c-4782-baa4-b61b4cde118a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" Apr 22 16:07:54.219110 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.218891 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8fb9000a-7c2c-4782-baa4-b61b4cde118a-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-mnh25\" (UID: \"8fb9000a-7c2c-4782-baa4-b61b4cde118a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" Apr 22 16:07:54.320036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.319983 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrjlf\" (UniqueName: \"kubernetes.io/projected/8fb9000a-7c2c-4782-baa4-b61b4cde118a-kube-api-access-mrjlf\") pod \"kuadrant-console-plugin-6cb54b5c86-mnh25\" (UID: \"8fb9000a-7c2c-4782-baa4-b61b4cde118a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" Apr 22 16:07:54.320036 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.320035 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8fb9000a-7c2c-4782-baa4-b61b4cde118a-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-mnh25\" (UID: \"8fb9000a-7c2c-4782-baa4-b61b4cde118a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" Apr 22 16:07:54.320287 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.320072 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8fb9000a-7c2c-4782-baa4-b61b4cde118a-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-mnh25\" (UID: \"8fb9000a-7c2c-4782-baa4-b61b4cde118a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" Apr 22 16:07:54.320287 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:07:54.320195 2565 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 22 16:07:54.320357 ip-10-0-135-152 kubenswrapper[2565]: E0422 16:07:54.320302 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fb9000a-7c2c-4782-baa4-b61b4cde118a-plugin-serving-cert podName:8fb9000a-7c2c-4782-baa4-b61b4cde118a nodeName:}" failed. No retries permitted until 2026-04-22 16:07:54.820280238 +0000 UTC m=+559.255373915 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/8fb9000a-7c2c-4782-baa4-b61b4cde118a-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-mnh25" (UID: "8fb9000a-7c2c-4782-baa4-b61b4cde118a") : secret "plugin-serving-cert" not found Apr 22 16:07:54.320704 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.320687 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8fb9000a-7c2c-4782-baa4-b61b4cde118a-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-mnh25\" (UID: \"8fb9000a-7c2c-4782-baa4-b61b4cde118a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" Apr 22 16:07:54.332465 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.332444 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrjlf\" (UniqueName: \"kubernetes.io/projected/8fb9000a-7c2c-4782-baa4-b61b4cde118a-kube-api-access-mrjlf\") pod \"kuadrant-console-plugin-6cb54b5c86-mnh25\" (UID: \"8fb9000a-7c2c-4782-baa4-b61b4cde118a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" Apr 22 16:07:54.825498 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.825443 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb9000a-7c2c-4782-baa4-b61b4cde118a-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-mnh25\" (UID: \"8fb9000a-7c2c-4782-baa4-b61b4cde118a\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" Apr 22 16:07:54.828183 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:54.828152 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fb9000a-7c2c-4782-baa4-b61b4cde118a-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-mnh25\" (UID: \"8fb9000a-7c2c-4782-baa4-b61b4cde118a\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" Apr 22 16:07:55.029241 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:55.029190 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" Apr 22 16:07:55.158599 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:55.158442 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25"] Apr 22 16:07:55.161572 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:07:55.161532 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fb9000a_7c2c_4782_baa4_b61b4cde118a.slice/crio-1b2af72deaf79032b1d0a0b82e464052d9bb1e9ac44ea73057a5545052946c6c WatchSource:0}: Error finding container 1b2af72deaf79032b1d0a0b82e464052d9bb1e9ac44ea73057a5545052946c6c: Status 404 returned error can't find the container with id 1b2af72deaf79032b1d0a0b82e464052d9bb1e9ac44ea73057a5545052946c6c Apr 22 16:07:55.926270 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:07:55.926226 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" event={"ID":"8fb9000a-7c2c-4782-baa4-b61b4cde118a","Type":"ContainerStarted","Data":"1b2af72deaf79032b1d0a0b82e464052d9bb1e9ac44ea73057a5545052946c6c"} Apr 22 16:08:21.022248 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:21.022214 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" event={"ID":"8fb9000a-7c2c-4782-baa4-b61b4cde118a","Type":"ContainerStarted","Data":"99bd1f8667491c751c4fb9b710c3e41a8b36e4b26e67f0370c777fe932fce6d7"} Apr 22 16:08:21.038673 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:21.038617 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-mnh25" podStartSLOduration=1.399150328 
podStartE2EDuration="27.038601107s" podCreationTimestamp="2026-04-22 16:07:54 +0000 UTC" firstStartedPulling="2026-04-22 16:07:55.163232536 +0000 UTC m=+559.598326211" lastFinishedPulling="2026-04-22 16:08:20.802683302 +0000 UTC m=+585.237776990" observedRunningTime="2026-04-22 16:08:21.036537159 +0000 UTC m=+585.471630855" watchObservedRunningTime="2026-04-22 16:08:21.038601107 +0000 UTC m=+585.473694804" Apr 22 16:08:36.034525 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:36.034498 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/2.log" Apr 22 16:08:36.034981 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:36.034673 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/2.log" Apr 22 16:08:36.037103 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:36.037073 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-acl-logging/0.log" Apr 22 16:08:36.037201 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:36.037168 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-acl-logging/0.log" Apr 22 16:08:49.200218 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.200179 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6qbwg"] Apr 22 16:08:49.255976 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.255935 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6qbwg"] Apr 22 16:08:49.255976 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.255972 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-6qbwg"] Apr 22 16:08:49.256218 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.256032 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" Apr 22 16:08:49.258304 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.258284 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 16:08:49.411370 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.411333 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/641e68f0-7902-4b68-a8f4-b47a3a710138-config-file\") pod \"limitador-limitador-78c99df468-6qbwg\" (UID: \"641e68f0-7902-4b68-a8f4-b47a3a710138\") " pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" Apr 22 16:08:49.411556 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.411401 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p24pv\" (UniqueName: \"kubernetes.io/projected/641e68f0-7902-4b68-a8f4-b47a3a710138-kube-api-access-p24pv\") pod \"limitador-limitador-78c99df468-6qbwg\" (UID: \"641e68f0-7902-4b68-a8f4-b47a3a710138\") " pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" Apr 22 16:08:49.515958 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.515925 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/641e68f0-7902-4b68-a8f4-b47a3a710138-config-file\") pod \"limitador-limitador-78c99df468-6qbwg\" (UID: \"641e68f0-7902-4b68-a8f4-b47a3a710138\") " pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" Apr 22 16:08:49.516150 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.516017 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p24pv\" (UniqueName: \"kubernetes.io/projected/641e68f0-7902-4b68-a8f4-b47a3a710138-kube-api-access-p24pv\") pod \"limitador-limitador-78c99df468-6qbwg\" (UID: \"641e68f0-7902-4b68-a8f4-b47a3a710138\") " pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" Apr 22 16:08:49.516572 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.516552 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/641e68f0-7902-4b68-a8f4-b47a3a710138-config-file\") pod \"limitador-limitador-78c99df468-6qbwg\" (UID: \"641e68f0-7902-4b68-a8f4-b47a3a710138\") " pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" Apr 22 16:08:49.525151 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.525073 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p24pv\" (UniqueName: \"kubernetes.io/projected/641e68f0-7902-4b68-a8f4-b47a3a710138-kube-api-access-p24pv\") pod \"limitador-limitador-78c99df468-6qbwg\" (UID: \"641e68f0-7902-4b68-a8f4-b47a3a710138\") " pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" Apr 22 16:08:49.566735 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.566707 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" Apr 22 16:08:49.695803 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:49.695743 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6qbwg"] Apr 22 16:08:49.698074 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:08:49.698045 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod641e68f0_7902_4b68_a8f4_b47a3a710138.slice/crio-bb32b9a15e7364c1522b6f270d04a453565e4f61efa34d8e7a86bde2cb24cd63 WatchSource:0}: Error finding container bb32b9a15e7364c1522b6f270d04a453565e4f61efa34d8e7a86bde2cb24cd63: Status 404 returned error can't find the container with id bb32b9a15e7364c1522b6f270d04a453565e4f61efa34d8e7a86bde2cb24cd63 Apr 22 16:08:50.116776 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:50.116736 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" event={"ID":"641e68f0-7902-4b68-a8f4-b47a3a710138","Type":"ContainerStarted","Data":"bb32b9a15e7364c1522b6f270d04a453565e4f61efa34d8e7a86bde2cb24cd63"} Apr 22 16:08:53.129221 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:53.129135 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" event={"ID":"641e68f0-7902-4b68-a8f4-b47a3a710138","Type":"ContainerStarted","Data":"45d6bad2268959e40c4f5e8bb75c31847afbf93e93c43f81a8061f3e8eeb0b04"} Apr 22 16:08:53.129552 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:53.129360 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" Apr 22 16:08:53.143995 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:08:53.143952 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" podStartSLOduration=1.00235033 
podStartE2EDuration="4.143939143s" podCreationTimestamp="2026-04-22 16:08:49 +0000 UTC" firstStartedPulling="2026-04-22 16:08:49.699990371 +0000 UTC m=+614.135084060" lastFinishedPulling="2026-04-22 16:08:52.841579182 +0000 UTC m=+617.276672873" observedRunningTime="2026-04-22 16:08:53.142736559 +0000 UTC m=+617.577830258" watchObservedRunningTime="2026-04-22 16:08:53.143939143 +0000 UTC m=+617.579032840" Apr 22 16:09:04.133203 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:09:04.133175 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-6qbwg" Apr 22 16:09:28.369768 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:09:28.369733 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6qbwg"] Apr 22 16:10:07.886028 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:07.885988 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6qbwg"] Apr 22 16:10:14.874312 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:14.874273 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6qbwg"] Apr 22 16:10:20.854057 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:20.854023 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp"] Apr 22 16:10:20.857545 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:20.857513 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:20.861419 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:20.861369 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 22 16:10:20.861733 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:20.861564 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-k94wr\"" Apr 22 16:10:20.861847 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:20.861593 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 22 16:10:20.861847 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:20.861625 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 22 16:10:20.866063 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:20.866010 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp"] Apr 22 16:10:20.983207 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:20.983170 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6qbwg"] Apr 22 16:10:21.006420 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.006390 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.006544 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.006434 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.006544 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.006476 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.006544 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.006500 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpfvw\" (UniqueName: \"kubernetes.io/projected/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-kube-api-access-wpfvw\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.006655 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.006574 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.006655 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.006611 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: 
\"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.107936 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.107840 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.107936 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.107882 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpfvw\" (UniqueName: \"kubernetes.io/projected/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-kube-api-access-wpfvw\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.108216 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.107943 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.108216 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.107965 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.108216 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.108027 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.108379 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.108207 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.108379 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.108333 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.108455 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.108404 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.108678 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.108658 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: 
\"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.110929 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.110903 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.111080 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.111063 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.116491 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.116470 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpfvw\" (UniqueName: \"kubernetes.io/projected/38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9-kube-api-access-wpfvw\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp\" (UID: \"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.169610 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.169582 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:21.293467 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.293435 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp"] Apr 22 16:10:21.297232 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:10:21.297195 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38de9878_dcd3_4c1b_8d58_2c8e4d2f61b9.slice/crio-04c2e5cbda63a5a453ed05b6e096170ca278814e8c6899599f55a6177de6d7ca WatchSource:0}: Error finding container 04c2e5cbda63a5a453ed05b6e096170ca278814e8c6899599f55a6177de6d7ca: Status 404 returned error can't find the container with id 04c2e5cbda63a5a453ed05b6e096170ca278814e8c6899599f55a6177de6d7ca Apr 22 16:10:21.298934 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.298915 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 16:10:21.414069 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:21.413984 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" event={"ID":"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9","Type":"ContainerStarted","Data":"04c2e5cbda63a5a453ed05b6e096170ca278814e8c6899599f55a6177de6d7ca"} Apr 22 16:10:23.878175 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:23.878055 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6qbwg"] Apr 22 16:10:27.441528 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:27.441486 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" event={"ID":"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9","Type":"ContainerStarted","Data":"6d8bf75b599a01c753123157c72d7c8bfdeb139209fb290eef2252702464682d"} Apr 22 16:10:32.459313 ip-10-0-135-152 kubenswrapper[2565]: I0422 
16:10:32.459279 2565 generic.go:358] "Generic (PLEG): container finished" podID="38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9" containerID="6d8bf75b599a01c753123157c72d7c8bfdeb139209fb290eef2252702464682d" exitCode=0 Apr 22 16:10:32.459753 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:32.459350 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" event={"ID":"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9","Type":"ContainerDied","Data":"6d8bf75b599a01c753123157c72d7c8bfdeb139209fb290eef2252702464682d"} Apr 22 16:10:34.469725 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:34.469688 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" event={"ID":"38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9","Type":"ContainerStarted","Data":"bc639b0f288288aed9468508b1911578aa33eb7a4e26528185da2786115cf122"} Apr 22 16:10:34.470157 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:34.469975 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:34.487098 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:34.487035 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" podStartSLOduration=2.293950943 podStartE2EDuration="14.487021672s" podCreationTimestamp="2026-04-22 16:10:20 +0000 UTC" firstStartedPulling="2026-04-22 16:10:21.299037323 +0000 UTC m=+705.734130999" lastFinishedPulling="2026-04-22 16:10:33.492108049 +0000 UTC m=+717.927201728" observedRunningTime="2026-04-22 16:10:34.485763639 +0000 UTC m=+718.920857371" watchObservedRunningTime="2026-04-22 16:10:34.487021672 +0000 UTC m=+718.922115369" Apr 22 16:10:45.485553 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:45.485518 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp" Apr 22 16:10:48.576682 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:48.576646 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6qbwg"] Apr 22 16:10:57.949177 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:57.949137 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr"] Apr 22 16:10:57.951491 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:57.951473 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:57.953617 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:57.953594 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 22 16:10:57.962537 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:57.962505 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr"] Apr 22 16:10:58.040659 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.040627 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/db07fab6-35dc-43e6-babd-6091b5e1b2ac-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.040659 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.040664 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/db07fab6-35dc-43e6-babd-6091b5e1b2ac-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: 
\"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.040897 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.040706 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/db07fab6-35dc-43e6-babd-6091b5e1b2ac-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.040897 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.040742 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr9qk\" (UniqueName: \"kubernetes.io/projected/db07fab6-35dc-43e6-babd-6091b5e1b2ac-kube-api-access-zr9qk\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.040897 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.040769 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/db07fab6-35dc-43e6-babd-6091b5e1b2ac-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.040897 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.040794 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db07fab6-35dc-43e6-babd-6091b5e1b2ac-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.141201 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.141172 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/db07fab6-35dc-43e6-babd-6091b5e1b2ac-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.141201 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.141204 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/db07fab6-35dc-43e6-babd-6091b5e1b2ac-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.141451 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.141241 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/db07fab6-35dc-43e6-babd-6091b5e1b2ac-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.141451 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.141282 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zr9qk\" (UniqueName: \"kubernetes.io/projected/db07fab6-35dc-43e6-babd-6091b5e1b2ac-kube-api-access-zr9qk\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.141451 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.141311 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/db07fab6-35dc-43e6-babd-6091b5e1b2ac-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.141451 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.141345 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db07fab6-35dc-43e6-babd-6091b5e1b2ac-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.141657 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.141621 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/db07fab6-35dc-43e6-babd-6091b5e1b2ac-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.141703 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.141625 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/db07fab6-35dc-43e6-babd-6091b5e1b2ac-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.141741 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.141723 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/db07fab6-35dc-43e6-babd-6091b5e1b2ac-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.143820 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.143795 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/db07fab6-35dc-43e6-babd-6091b5e1b2ac-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.143930 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.143914 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/db07fab6-35dc-43e6-babd-6091b5e1b2ac-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.148382 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.148364 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr9qk\" (UniqueName: \"kubernetes.io/projected/db07fab6-35dc-43e6-babd-6091b5e1b2ac-kube-api-access-zr9qk\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr\" (UID: \"db07fab6-35dc-43e6-babd-6091b5e1b2ac\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.260544 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.260504 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" Apr 22 16:10:58.383438 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.383414 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr"] Apr 22 16:10:58.385457 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:10:58.385420 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb07fab6_35dc_43e6_babd_6091b5e1b2ac.slice/crio-99e7531601375995aa52ef70a1fd06afce3ce367d3a5caea9a614db50deb0c50 WatchSource:0}: Error finding container 99e7531601375995aa52ef70a1fd06afce3ce367d3a5caea9a614db50deb0c50: Status 404 returned error can't find the container with id 99e7531601375995aa52ef70a1fd06afce3ce367d3a5caea9a614db50deb0c50 Apr 22 16:10:58.553276 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.553180 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" event={"ID":"db07fab6-35dc-43e6-babd-6091b5e1b2ac","Type":"ContainerStarted","Data":"d9aef6ba7e61aa1574c1c79333a5ca2dedf948995ded730f570cfc8ee1b801be"} Apr 22 16:10:58.553276 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:58.553230 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" event={"ID":"db07fab6-35dc-43e6-babd-6091b5e1b2ac","Type":"ContainerStarted","Data":"99e7531601375995aa52ef70a1fd06afce3ce367d3a5caea9a614db50deb0c50"} Apr 22 16:10:59.575484 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:10:59.575451 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6qbwg"] Apr 22 16:11:06.585359 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:11:06.585327 2565 generic.go:358] "Generic (PLEG): container finished" podID="db07fab6-35dc-43e6-babd-6091b5e1b2ac" 
containerID="d9aef6ba7e61aa1574c1c79333a5ca2dedf948995ded730f570cfc8ee1b801be" exitCode=0
Apr 22 16:11:06.585777 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:11:06.585386 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" event={"ID":"db07fab6-35dc-43e6-babd-6091b5e1b2ac","Type":"ContainerDied","Data":"d9aef6ba7e61aa1574c1c79333a5ca2dedf948995ded730f570cfc8ee1b801be"}
Apr 22 16:11:07.593125 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:11:07.593070 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" event={"ID":"db07fab6-35dc-43e6-babd-6091b5e1b2ac","Type":"ContainerStarted","Data":"1f8e2519b344b25d0585470ebaffc657382e0c46c3f7fffff803cd13ff2c4e8c"}
Apr 22 16:11:07.593488 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:11:07.593296 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr"
Apr 22 16:11:07.610019 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:11:07.609972 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr" podStartSLOduration=10.35396885 podStartE2EDuration="10.609958129s" podCreationTimestamp="2026-04-22 16:10:57 +0000 UTC" firstStartedPulling="2026-04-22 16:11:06.58599898 +0000 UTC m=+751.021092659" lastFinishedPulling="2026-04-22 16:11:06.841988259 +0000 UTC m=+751.277081938" observedRunningTime="2026-04-22 16:11:07.608005883 +0000 UTC m=+752.043099580" watchObservedRunningTime="2026-04-22 16:11:07.609958129 +0000 UTC m=+752.045051825"
Apr 22 16:11:18.609483 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:11:18.609453 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr"
Apr 22 16:13:36.058714 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:36.058636 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/2.log"
Apr 22 16:13:36.060378 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:36.060352 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/2.log"
Apr 22 16:13:36.061617 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:36.061598 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-acl-logging/0.log"
Apr 22 16:13:36.063273 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:36.063252 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-acl-logging/0.log"
Apr 22 16:13:36.835421 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:36.835385 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-54dfb4598d-sjklr_8196a26a-5eed-4067-9591-67935be9b123/manager/0.log"
Apr 22 16:13:38.470253 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:38.470225 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-mnh25_8fb9000a-7c2c-4782-baa4-b61b4cde118a/kuadrant-console-plugin/0.log"
Apr 22 16:13:38.791060 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:38.791036 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-6qbwg_641e68f0-7902-4b68-a8f4-b47a3a710138/limitador/0.log"
Apr 22 16:13:39.526341 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:39.526305 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6c4b9b554-vd75l_56e03439-958f-4f76-83f9-7b88674b8eb3/kube-auth-proxy/0.log"
Apr 22 16:13:40.521368 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:40.521337 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp_38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9/main/0.log"
Apr 22 16:13:40.527286 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:40.527245 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-6rgmp_38de9878-dcd3-4c1b-8d58-2c8e4d2f61b9/storage-initializer/0.log"
Apr 22 16:13:40.629019 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:40.628971 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr_db07fab6-35dc-43e6-babd-6091b5e1b2ac/storage-initializer/0.log"
Apr 22 16:13:40.636607 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:40.636583 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-ljcsr_db07fab6-35dc-43e6-babd-6091b5e1b2ac/main/0.log"
Apr 22 16:13:47.150958 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:47.150926 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-h264g_8b2db19d-176e-4219-830f-a3b6ed5a34e0/global-pull-secret-syncer/0.log"
Apr 22 16:13:47.332353 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:47.332321 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rlxkj_590a23f3-74f3-406f-a0f0-bf1db0f7b0a0/konnectivity-agent/0.log"
Apr 22 16:13:47.377074 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:47.377042 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-152.ec2.internal_04b47764dfb923467fa0be6032d47f9e/haproxy/0.log"
Apr 22 16:13:52.007686 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:52.007597 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-mnh25_8fb9000a-7c2c-4782-baa4-b61b4cde118a/kuadrant-console-plugin/0.log"
Apr 22 16:13:52.118169 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:52.118145 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-6qbwg_641e68f0-7902-4b68-a8f4-b47a3a710138/limitador/0.log"
Apr 22 16:13:53.832504 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:53.832470 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-74dfc75f77-4s8jm_64aca963-edf8-457f-990b-6fd0e03348b1/metrics-server/0.log"
Apr 22 16:13:53.919075 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:53.919046 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ctlfq_7e9363b5-12aa-4223-bf69-f67620bf66d7/node-exporter/0.log"
Apr 22 16:13:53.944858 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:53.944827 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ctlfq_7e9363b5-12aa-4223-bf69-f67620bf66d7/kube-rbac-proxy/0.log"
Apr 22 16:13:53.969322 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:53.969295 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ctlfq_7e9363b5-12aa-4223-bf69-f67620bf66d7/init-textfile/0.log"
Apr 22 16:13:54.159117 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.159026 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t89m5_2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca/kube-rbac-proxy-main/0.log"
Apr 22 16:13:54.181730 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.181708 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t89m5_2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca/kube-rbac-proxy-self/0.log"
Apr 22 16:13:54.203067 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.203040 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-t89m5_2b3da36b-72fc-4c43-98f5-6cfe4e13c7ca/openshift-state-metrics/0.log"
Apr 22 16:13:54.240470 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.240440 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8629a6c0-975f-44bb-946e-c5676e4bf5ed/prometheus/0.log"
Apr 22 16:13:54.261785 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.261748 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8629a6c0-975f-44bb-946e-c5676e4bf5ed/config-reloader/0.log"
Apr 22 16:13:54.290370 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.290331 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8629a6c0-975f-44bb-946e-c5676e4bf5ed/thanos-sidecar/0.log"
Apr 22 16:13:54.320220 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.320192 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8629a6c0-975f-44bb-946e-c5676e4bf5ed/kube-rbac-proxy-web/0.log"
Apr 22 16:13:54.343139 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.343116 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8629a6c0-975f-44bb-946e-c5676e4bf5ed/kube-rbac-proxy/0.log"
Apr 22 16:13:54.367985 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.367960 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8629a6c0-975f-44bb-946e-c5676e4bf5ed/kube-rbac-proxy-thanos/0.log"
Apr 22 16:13:54.391194 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.391148 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8629a6c0-975f-44bb-946e-c5676e4bf5ed/init-config-reloader/0.log"
Apr 22 16:13:54.419944 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.419866 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dhtgt_ebef7e67-a38a-414a-88b6-ee4a94e326fe/prometheus-operator/0.log"
Apr 22 16:13:54.438543 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.438507 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dhtgt_ebef7e67-a38a-414a-88b6-ee4a94e326fe/kube-rbac-proxy/0.log"
Apr 22 16:13:54.464673 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.464642 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-hdfnk_d7a23691-2673-40b8-ae6c-5af9e81bc2d4/prometheus-operator-admission-webhook/0.log"
Apr 22 16:13:54.496077 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.496049 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5997968445-m59rp_3b837bd3-a08f-465c-a109-a195a45ffccb/telemeter-client/0.log"
Apr 22 16:13:54.527568 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.527538 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5997968445-m59rp_3b837bd3-a08f-465c-a109-a195a45ffccb/reload/0.log"
Apr 22 16:13:54.549567 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.549533 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5997968445-m59rp_3b837bd3-a08f-465c-a109-a195a45ffccb/kube-rbac-proxy/0.log"
Apr 22 16:13:54.575868 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.575830 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-695bbcdf7d-5tzxb_00a88686-6ce6-4405-bd51-ea862bbae204/thanos-query/0.log"
Apr 22 16:13:54.596704 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.596678 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-695bbcdf7d-5tzxb_00a88686-6ce6-4405-bd51-ea862bbae204/kube-rbac-proxy-web/0.log"
Apr 22 16:13:54.617287 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.617261 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-695bbcdf7d-5tzxb_00a88686-6ce6-4405-bd51-ea862bbae204/kube-rbac-proxy/0.log"
Apr 22 16:13:54.637832 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.637806 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-695bbcdf7d-5tzxb_00a88686-6ce6-4405-bd51-ea862bbae204/prom-label-proxy/0.log"
Apr 22 16:13:54.660472 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.660446 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-695bbcdf7d-5tzxb_00a88686-6ce6-4405-bd51-ea862bbae204/kube-rbac-proxy-rules/0.log"
Apr 22 16:13:54.681061 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:54.680987 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-695bbcdf7d-5tzxb_00a88686-6ce6-4405-bd51-ea862bbae204/kube-rbac-proxy-metrics/0.log"
Apr 22 16:13:55.751820 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:55.751797 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-zk6jf_033e308b-fe58-4cc9-88b6-16bdf7f37e5b/networking-console-plugin/0.log"
Apr 22 16:13:56.015358 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.015327 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"]
Apr 22 16:13:56.019229 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.019209 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.021542 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.021514 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b5p7f\"/\"kube-root-ca.crt\""
Apr 22 16:13:56.022436 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.022380 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-b5p7f\"/\"default-dockercfg-bfrf8\""
Apr 22 16:13:56.022436 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.022400 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-b5p7f\"/\"openshift-service-ca.crt\""
Apr 22 16:13:56.025756 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.025733 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"]
Apr 22 16:13:56.068181 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.068144 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e15ffb6-09e1-40ff-a859-07c9ef589832-lib-modules\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.068181 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.068179 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e15ffb6-09e1-40ff-a859-07c9ef589832-podres\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.068392 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.068215 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5sfr\" (UniqueName: \"kubernetes.io/projected/3e15ffb6-09e1-40ff-a859-07c9ef589832-kube-api-access-s5sfr\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.068392 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.068298 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e15ffb6-09e1-40ff-a859-07c9ef589832-sys\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.068392 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.068329 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e15ffb6-09e1-40ff-a859-07c9ef589832-proc\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.169385 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.169348 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e15ffb6-09e1-40ff-a859-07c9ef589832-sys\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.169385 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.169389 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e15ffb6-09e1-40ff-a859-07c9ef589832-proc\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.169639 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.169439 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e15ffb6-09e1-40ff-a859-07c9ef589832-lib-modules\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.169639 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.169463 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e15ffb6-09e1-40ff-a859-07c9ef589832-sys\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.169639 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.169471 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e15ffb6-09e1-40ff-a859-07c9ef589832-podres\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.169639 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.169543 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5sfr\" (UniqueName: \"kubernetes.io/projected/3e15ffb6-09e1-40ff-a859-07c9ef589832-kube-api-access-s5sfr\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.169639 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.169555 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e15ffb6-09e1-40ff-a859-07c9ef589832-proc\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.169639 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.169569 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e15ffb6-09e1-40ff-a859-07c9ef589832-podres\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.169639 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.169593 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e15ffb6-09e1-40ff-a859-07c9ef589832-lib-modules\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.176802 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.176778 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5sfr\" (UniqueName: \"kubernetes.io/projected/3e15ffb6-09e1-40ff-a859-07c9ef589832-kube-api-access-s5sfr\") pod \"perf-node-gather-daemonset-tvjn9\" (UID: \"3e15ffb6-09e1-40ff-a859-07c9ef589832\") " pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.309208 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.309120 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/2.log"
Apr 22 16:13:56.316833 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.316811 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nrtqr_72f4e084-b355-494f-955a-9d9d02e32cdb/console-operator/3.log"
Apr 22 16:13:56.330921 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.330892 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:56.451510 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:56.451407 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"]
Apr 22 16:13:56.454206 ip-10-0-135-152 kubenswrapper[2565]: W0422 16:13:56.454177 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3e15ffb6_09e1_40ff_a859_07c9ef589832.slice/crio-facaa4e0440a2bf9505940acaa2b26d94606be3ea776111c26939563db0c5fb1 WatchSource:0}: Error finding container facaa4e0440a2bf9505940acaa2b26d94606be3ea776111c26939563db0c5fb1: Status 404 returned error can't find the container with id facaa4e0440a2bf9505940acaa2b26d94606be3ea776111c26939563db0c5fb1
Apr 22 16:13:57.155706 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:57.155669 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9" event={"ID":"3e15ffb6-09e1-40ff-a859-07c9ef589832","Type":"ContainerStarted","Data":"07460bfc75ba629a6ee5ed138669b59ff805b289afcdc4b484af019eb994719c"}
Apr 22 16:13:57.155706 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:57.155706 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9" event={"ID":"3e15ffb6-09e1-40ff-a859-07c9ef589832","Type":"ContainerStarted","Data":"facaa4e0440a2bf9505940acaa2b26d94606be3ea776111c26939563db0c5fb1"}
Apr 22 16:13:57.156159 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:57.155797 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:13:57.168854 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:57.168806 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9" podStartSLOduration=1.168793936 podStartE2EDuration="1.168793936s" podCreationTimestamp="2026-04-22 16:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:13:57.168209123 +0000 UTC m=+921.603302844" watchObservedRunningTime="2026-04-22 16:13:57.168793936 +0000 UTC m=+921.603887632"
Apr 22 16:13:58.133276 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:58.133237 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hx8z7_83a80857-8a6e-454e-be2b-ecc561993b6d/dns/0.log"
Apr 22 16:13:58.153827 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:58.153790 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hx8z7_83a80857-8a6e-454e-be2b-ecc561993b6d/kube-rbac-proxy/0.log"
Apr 22 16:13:58.293758 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:58.293726 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-k6xqz_a8967a65-4b16-4099-9db4-ce8642ba6138/dns-node-resolver/0.log"
Apr 22 16:13:58.836524 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:58.836489 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9cj4f_2b5899ec-33ba-45f8-b259-82f0af9723a4/node-ca/0.log"
Apr 22 16:13:59.791235 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:13:59.791201 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6c4b9b554-vd75l_56e03439-958f-4f76-83f9-7b88674b8eb3/kube-auth-proxy/0.log"
Apr 22 16:14:00.422583 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:00.422551 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6s4ph_1480b1d8-29b2-4b9f-9d5a-04a492daadb0/serve-healthcheck-canary/0.log"
Apr 22 16:14:00.948856 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:00.948823 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-24bnd_11ec3c8d-9449-4a8d-ae4f-431815ccfd6f/kube-rbac-proxy/0.log"
Apr 22 16:14:00.970569 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:00.970542 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-24bnd_11ec3c8d-9449-4a8d-ae4f-431815ccfd6f/exporter/0.log"
Apr 22 16:14:00.991633 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:00.991603 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-24bnd_11ec3c8d-9449-4a8d-ae4f-431815ccfd6f/extractor/0.log"
Apr 22 16:14:03.081823 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:03.081789 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-54dfb4598d-sjklr_8196a26a-5eed-4067-9591-67935be9b123/manager/0.log"
Apr 22 16:14:03.169571 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:03.169544 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-b5p7f/perf-node-gather-daemonset-tvjn9"
Apr 22 16:14:04.203323 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:04.203294 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7d868c4d86-r8z45_81c02492-94cc-4b38-9784-c30031e4d287/manager/0.log"
Apr 22 16:14:10.029597 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:10.029566 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4j4vm_d9391598-fc74-406d-ad2b-087fbbe59063/kube-multus/0.log"
Apr 22 16:14:10.213595 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:10.213566 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c7hk6_98755c75-9268-4cfa-8cae-e8ccf20974be/kube-multus-additional-cni-plugins/0.log"
Apr 22 16:14:10.236677 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:10.236652 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c7hk6_98755c75-9268-4cfa-8cae-e8ccf20974be/egress-router-binary-copy/0.log"
Apr 22 16:14:10.258838 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:10.258810 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c7hk6_98755c75-9268-4cfa-8cae-e8ccf20974be/cni-plugins/0.log"
Apr 22 16:14:10.281007 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:10.280947 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c7hk6_98755c75-9268-4cfa-8cae-e8ccf20974be/bond-cni-plugin/0.log"
Apr 22 16:14:10.301713 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:10.301694 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c7hk6_98755c75-9268-4cfa-8cae-e8ccf20974be/routeoverride-cni/0.log"
Apr 22 16:14:10.324268 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:10.324249 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c7hk6_98755c75-9268-4cfa-8cae-e8ccf20974be/whereabouts-cni-bincopy/0.log"
Apr 22 16:14:10.345924 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:10.345904 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c7hk6_98755c75-9268-4cfa-8cae-e8ccf20974be/whereabouts-cni/0.log"
Apr 22 16:14:10.591749 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:10.591667 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2nbv7_2cd47a51-d8a9-48f4-bf8e-d11d89cead22/network-metrics-daemon/0.log"
Apr 22 16:14:10.610017 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:10.609983 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2nbv7_2cd47a51-d8a9-48f4-bf8e-d11d89cead22/kube-rbac-proxy/0.log"
Apr 22 16:14:11.497640 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:11.497605 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-controller/0.log"
Apr 22 16:14:11.514506 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:11.514476 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-acl-logging/0.log"
Apr 22 16:14:11.523421 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:11.523397 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovn-acl-logging/1.log"
Apr 22 16:14:11.542357 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:11.542323 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/kube-rbac-proxy-node/0.log"
Apr 22 16:14:11.563526 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:11.563507 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 16:14:11.582920 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:11.582899 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/northd/0.log"
Apr 22 16:14:11.603566 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:11.603545 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/nbdb/0.log"
Apr 22 16:14:11.625213 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:11.625194 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/sbdb/0.log"
Apr 22 16:14:11.785113 ip-10-0-135-152 kubenswrapper[2565]: I0422 16:14:11.785058 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-95pm2_4775e631-4da8-45cc-9fb4-6238451abe84/ovnkube-controller/0.log"