Apr 16 18:15:02.962043 ip-10-0-138-88 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:15:02.962073 ip-10-0-138-88 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:15:02.962082 ip-10-0-138-88 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:15:02.962385 ip-10-0-138-88 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:15:13.127063 ip-10-0-138-88 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:15:13.127090 ip-10-0-138-88 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b6e17d31537b4814aa79160ee8aba2fb --
Apr 16 18:17:36.981691 ip-10-0-138-88 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:17:37.407844 ip-10-0-138-88 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:37.407844 ip-10-0-138-88 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:17:37.407844 ip-10-0-138-88 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:37.407844 ip-10-0-138-88 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:17:37.407844 ip-10-0-138-88 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:37.411884 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.411779 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:17:37.416253 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416222 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:37.416253 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416245 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:37.416253 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416249 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:37.416253 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416253 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:37.416253 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416256 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:37.416253 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416260 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:37.416253 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416263 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:37.416253 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416268 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416271 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416274 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416277 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416281 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416283 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416286 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416289 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416292 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416295 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416297 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416300 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416303 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416306 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416309 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416311 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416315 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416317 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416320 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416323 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:37.416564 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416325 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416328 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416330 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416334 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416336 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416339 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416342 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416344 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416347 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416349 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416352 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416355 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416357 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416360 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416364 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416367 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416373 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416377 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416380 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:37.417048 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416383 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416386 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416389 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416404 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416407 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416410 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416413 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416416 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416420 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416423 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416426 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416429 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416432 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416434 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416437 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416440 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416442 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416445 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416448 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:37.417565 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416450 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416453 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416456 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416458 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416461 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416464 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416466 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416469 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416472 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416476 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416480 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416483 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416486 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416492 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416495 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416498 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416501 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416504 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416506 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416509 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:37.418021 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.416512 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418254 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418264 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418270 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418275 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418281 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418284 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418289 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418293 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418297 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418300 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418303 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418307 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418310 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:17:37.420584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418315 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418318 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418321 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418324 2579 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418327 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418330 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418335 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418338 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418342 2579 flags.go:64] FLAG: --config-dir=""
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418344 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418348 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418352 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418355 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418358 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418362 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418365 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418368 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418372 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418375 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418378 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418388 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418402 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418406 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418408 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418412 2579 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:17:37.421134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418416 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418420 2579 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418423 2579 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418427 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418430 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418433 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418438 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418443 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418446 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418449 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418452 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418455 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418458 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418461 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418464 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418467 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418470 2579 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418474 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418477 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418481 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418484 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418487 2579 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418490 2579 flags.go:64] FLAG: --help="false"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418493 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-138-88.ec2.internal"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418496 2579 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 18:17:37.421767 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418499 2579 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418502 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418506 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418509 2579 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418512 2579 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418515 2579 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418518 2579 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418521 2579 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418524 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418527 2579 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418530 2579 flags.go:64] FLAG: --kube-reserved=""
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418533 2579 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418536 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418539 2579 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418543 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418546 2579 flags.go:64] FLAG: --lock-file=""
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418549 2579 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418553 2579 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418556 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418562 2579 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418565 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418568 2579 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418571 2579 flags.go:64] FLAG: --logging-format="text"
Apr 16 18:17:37.422369 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418574 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418578 2579 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418581 2579 flags.go:64] FLAG: --manifest-url=""
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418584 2579 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418588 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418591 2579 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418596 2579 flags.go:64] FLAG: --max-pods="110"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418599 2579 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418602 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418605 2579 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418608 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418612 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418615 2579 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418618 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418626 2579 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418629 2579 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418632 2579 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418636 2579 flags.go:64] FLAG: --pod-cidr=""
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418639 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418645 2579 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418648 2579 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418651 2579 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418654 2579 flags.go:64] FLAG: --port="10250"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418658 2579 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 18:17:37.422934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418661 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-059263f36f985391a"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418664 2579 flags.go:64] FLAG: --qos-reserved=""
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418667 2579 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418670 2579 flags.go:64] FLAG: --register-node="true"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418673 2579 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418677 2579 flags.go:64] FLAG: --register-with-taints=""
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418683 2579 flags.go:64] FLAG: --registry-burst="10"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418686 2579 flags.go:64] FLAG: --registry-qps="5"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418689 2579 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418691 2579 flags.go:64] FLAG: --reserved-memory=""
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418696 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418699 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418701 2579 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418704 2579 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418707 2579 flags.go:64] FLAG: --runonce="false"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418710 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418713 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418716 2579 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418719 2579 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418722 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418725 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418728 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418731 2579 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418734 2579 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418737 2579 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418740 2579 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 18:17:37.423573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418744 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418747 2579 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418750 2579 flags.go:64] FLAG: --system-cgroups=""
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418753 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418759 2579 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418762 2579 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418765 2579 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418770 2579 flags.go:64] FLAG: --tls-min-version=""
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418773 2579 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418775 2579 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418778 2579 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418781 2579 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418784 2579 flags.go:64] FLAG: --v="2"
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418789 2579 flags.go:64] FLAG: --version="false"
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418793 2579 flags.go:64] FLAG: --vmodule=""
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418798 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 18:17:37.424207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.418801 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 18:17:37.425802 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419083 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:37.425802 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419086 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419088 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419091 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419093 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419096 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419099 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419101 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419104 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419107 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419109 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419112 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419114 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419118 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419121 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419123 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419126 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419129 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419132 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419134 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419138 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:37.426304 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.419141 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:37.426847 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.419945 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:37.427143 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.427118 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:17:37.427173 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.427146 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:17:37.427207 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427198 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:37.427207 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427207 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427210 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427214 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427217 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427220 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427222 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427225 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427229 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427232 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427234 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427237 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427240 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427242 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427245 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427248 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427250 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427253 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427256 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427258 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427261 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:37.427260 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427264 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427266 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427269 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427272 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427275 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427278 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427281 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427284 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427287 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427289 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427293 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427295 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427298 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427300 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427303 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427306 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427308 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427311 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427313 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427316 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:37.427780 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427319 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427321 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427324 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427328 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427331 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427334 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427336 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427339 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427342 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427344 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427347 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427350 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427352 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427355 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427358 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427361 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427364 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427367 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427370 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427373 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:37.428276 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427375 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427378 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427380 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427384 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427386 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427389 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427409 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427413 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427415 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427418 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427421 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427425 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427431 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427434 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427437 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427440 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427443 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427446 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427449 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427452 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:37.428853 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427455 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427457 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427460 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427463 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427467 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.427473 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427578 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427585 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427590 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427593 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427596 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427599 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427603 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
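The feature_gate.go:384 lines (one just above, one earlier, one further below) record the effective gate map kubelet actually applied, printed in Go's map[key:value ...] rendering. A sketch for turning one of those lines into a dictionary, using a truncated sample rather than the full line from this journal:

    import re

    # Sample klog line in the shape of the feature_gate.go:384 entries above
    # (truncated to three gates; the real line lists all of them).
    line = 'I0416 18:17:37.427473 2579 feature_gate.go:384] feature gates: {map[ImageVolume:true KMSv1:true SELinuxMount:false]}'

    m = re.search(r'feature gates: \{map\[(.*)\]\}', line)
    gates = {}
    if m:
        for pair in m.group(1).split():
            name, _, value = pair.partition(':')   # gate names contain no colons
            gates[name] = value == 'true'

    print(gates)  # {'ImageVolume': True, 'KMSv1': True, 'SELinuxMount': False}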
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427606 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427609 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:37.429350 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427612 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427616 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427619 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427622 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427624 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427627 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427630 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427633 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427636 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427639 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427642 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427645 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427647 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427650 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427653 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427655 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427658 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427661 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427663 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:37.429734 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427666 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427669 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427672 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427674 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427678 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427681 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427683 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427686 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427688 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427691 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427694 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427697 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427699 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427702 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427704 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427708 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427710 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427713 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427715 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427718 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427721 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:37.430200 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427723 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427726 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427728 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427731 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427733 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427736 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427739 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427741 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427744 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427746 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427749 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427752 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427754 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427757 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427759 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427762 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427765 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427768 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427770 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:37.430712 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427773 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427775 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427778 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427781 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427783 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427786 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427789 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427792 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427795 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427798 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427801 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427803 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427806 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427808 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427811 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427813 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427816 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:37.431189 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:37.427819 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:37.431647 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.427824 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:37.431647 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.428540 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:17:37.431647 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.431283 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:17:37.432208 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.432194 2579 server.go:1019] "Starting client certificate rotation"
Apr 16 18:17:37.432304 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.432290 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:17:37.432338 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.432327 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:17:37.458543 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.458517 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:17:37.461374 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.461333 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:17:37.478960 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.478934 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:17:37.485619 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.485596 2579 log.go:25] "Validated CRI v1 image API"
Apr 16 18:17:37.486488 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.486464 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:17:37.486886 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.486870 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:17:37.491599 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.491575 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b184a91f-7cf4-4027-8a21-2582f21ee0fe:/dev/nvme0n1p3 ee1fdbcf-ecf2-41ba-9d06-4adf58b4b22b:/dev/nvme0n1p4]
Apr 16 18:17:37.491681 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.491597 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:17:37.496539 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.496414 2579 manager.go:217] Machine: {Timestamp:2026-04-16 18:17:37.49533629 +0000 UTC m=+0.402916310 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100205 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2afcc7103162db26b68109d17603fc SystemUUID:ec2afcc7-1031-62db-26b6-8109d17603fc BootID:b6e17d31-537b-4814-aa79-160ee8aba2fb Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:14:26:d3:c1:b7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:14:26:d3:c1:b7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1a:87:69:cf:9b:5b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:17:37.496539 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.496531 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:17:37.496666 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.496622 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:17:37.497697 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.497668 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:17:37.497882 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.497700 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-88.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:17:37.497929 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.497892 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:17:37.497929 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.497902 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:17:37.497929 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.497920 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:17:37.498013 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.497935 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:17:37.499304 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.499291 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:17:37.499435 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.499425 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:17:37.502636 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.502623 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:17:37.502689 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.502641 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:17:37.503212 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.503204 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:17:37.503245 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.503217 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:17:37.503245 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.503228 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:17:37.504224 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.504212 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:17:37.504269 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.504231 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:17:37.507218 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.507193 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:17:37.507873 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.507855 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wnnvq"
Apr 16 18:17:37.508857 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.508843 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:17:37.510176 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510162 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:17:37.510231 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510180 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:17:37.510231 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510186 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:17:37.510231 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510192 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:17:37.510231 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510198 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:17:37.510231 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510204 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 18:17:37.510231 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510210 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 18:17:37.510231 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510216 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 18:17:37.510231 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510223 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 18:17:37.510231 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510232 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 18:17:37.510495 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510243 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 18:17:37.510495 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510252 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 18:17:37.510965 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510955 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 18:17:37.510965 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.510964 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 18:17:37.514909 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.514883 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-88.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 18:17:37.515040 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.515023 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-88.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:17:37.515096 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.515051 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 18:17:37.515591 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.515577 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 18:17:37.515641 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.515626 2579 server.go:1295] "Started kubelet"
Apr 16 18:17:37.515760 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.515728 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 18:17:37.516201 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.516152 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 18:17:37.516310 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.516212 2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 18:17:37.516511 ip-10-0-138-88 systemd[1]: Started Kubernetes Kubelet.
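Among the entries above, the two reflector.go:200 "Failed to watch" errors and the csi_plugin.go:988 CSINode lookup were all rejected for User "system:anonymous". They fall between "Rotating certificates" (18:17:37.432327) and "Certificate signing request is issued" (18:17:37.517834, just below), which is consistent with requests being sent before the bootstrap client certificate had been issued. To pull only the error-level klog records out of a saved copy of this journal, a sketch (kubelet.log is a placeholder filename):

    import re

    # klog records start with a severity letter: I(nfo), W(arning), E(rror), F(atal).
    KLOG = re.compile(r'kubenswrapper\[\d+\]: ([IWEF])(\d{4} [\d:.]+)\s+\d+ (\S+)\] (.*)')

    with open("kubelet.log") as f:               # assumed filename for this excerpt
        for line in f:
            m = KLOG.search(line)
            if m and m.group(1) == 'E':          # keep only error-level entries
                print(m.group(2), m.group(3), m.group(4)[:120])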
Apr 16 18:17:37.517382 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.517368 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:17:37.517851 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.517834 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wnnvq"
Apr 16 18:17:37.518459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.518443 2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:17:37.520710 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.519885 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-88.ec2.internal.18a6e9272d339900 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-88.ec2.internal,UID:ip-10-0-138-88.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-88.ec2.internal,},FirstTimestamp:2026-04-16 18:17:37.515591936 +0000 UTC m=+0.423171955,LastTimestamp:2026-04-16 18:17:37.515591936 +0000 UTC m=+0.423171955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-88.ec2.internal,}"
Apr 16 18:17:37.522853 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.522830 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:17:37.523285 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.523267 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:17:37.524083 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.524066 2579 factory.go:55] Registering systemd factory
Apr 16 18:17:37.524083 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.524085 2579 factory.go:223] Registration of the systemd container factory successfully
Apr 16 18:17:37.524224 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.524068 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-88.ec2.internal\" not found"
Apr 16 18:17:37.524224 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.524100 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:17:37.524224 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.524100 2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:17:37.524224 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.524118 2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:17:37.524224 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.524069 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:17:37.524423 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.524227 2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:17:37.524423 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.524236 2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:17:37.524423 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.524357 2579 factory.go:153] Registering CRI-O factory
Apr 16 18:17:37.524423 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.524366 2579 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:17:37.524558 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.524431 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:17:37.524558 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.524452 2579 factory.go:103] Registering Raw factory
Apr 16 18:17:37.524558 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.524463 2579 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:17:37.525108 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.525094 2579 manager.go:319] Starting recovery of all containers
Apr 16 18:17:37.533948 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.533702 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:37.536577 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.536554 2579 manager.go:324] Recovery completed
Apr 16 18:17:37.537810 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.537787 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-88.ec2.internal\" not found" node="ip-10-0-138-88.ec2.internal"
Apr 16 18:17:37.538164 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.538147 2579 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 16 18:17:37.541268 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.541255 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:37.544221 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.544204 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:37.544291 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.544234 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:37.544291 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.544245 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:37.544776 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.544764 2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:17:37.544776 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.544775 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:17:37.544894 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.544799 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:17:37.547197 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.547185 2579 policy_none.go:49] "None policy: Start"
Apr 16 18:17:37.547249 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.547201 2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:17:37.547249 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.547211 2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:17:37.597877 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.586604 2579 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:17:37.597877 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.586644 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 18:17:37.597877 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.586658 2579 server.go:85] "Starting device plugin registration server"
Apr 16 18:17:37.597877 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.586944 2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 18:17:37.597877 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.586959 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 18:17:37.597877 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.587034 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 18:17:37.597877 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.587119 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 18:17:37.597877 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.587128 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 18:17:37.597877 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.587747 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 18:17:37.597877 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.587786 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-88.ec2.internal\" not found"
Apr 16 18:17:37.662728 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.662635 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 18:17:37.663905 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.663879 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 18:17:37.664004 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.663914 2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 18:17:37.664004 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.663940 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
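The entries that follow record the node registration milestones for this boot. To measure how long startup took from a saved copy of this journal, the milestone strings can be grepped out together with their journal timestamps (a minimal sketch; kubelet.log is an assumed filename, and the milestone strings are copied verbatim from entries in this excerpt):

    import re

    # Milestone strings taken verbatim from the journal entries in this excerpt.
    MILESTONES = (
        'Started Kubernetes Kubelet',
        '"Started kubelet"',
        '"Attempting to register node"',
        '"Successfully registered node"',
    )
    STAMP = re.compile(r'^Apr 16 (\d\d:\d\d:\d\d\.\d+)')

    with open("kubelet.log") as f:      # assumed filename, not part of the log
        for line in f:
            if any(s in line for s in MILESTONES):
                m = STAMP.match(line)
                print(m.group(1) if m else '(no stamp)', '|', line.strip()[:110])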
Apr 16 18:17:37.664004 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.663951 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:17:37.664004 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.663993 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:17:37.666972 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.666954 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:37.687946 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.687918 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:37.688928 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.688914 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:37.689022 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.688947 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:37.689022 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.688962 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:37.689022 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.688993 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.702824 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.702803 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.702917 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.702829 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-88.ec2.internal\": node \"ip-10-0-138-88.ec2.internal\" not found" Apr 16 18:17:37.727614 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.727586 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-88.ec2.internal\" not found" Apr 16 18:17:37.764841 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.764798 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-88.ec2.internal"] Apr 16 18:17:37.764925 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.764915 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:37.766476 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.766460 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:37.766543 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.766492 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:37.766543 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.766502 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:37.767693 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.767679 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:37.767856 ip-10-0-138-88 kubenswrapper[2579]: I0416 
18:17:37.767840 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.767918 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.767878 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:37.769504 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.769488 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:37.769504 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.769497 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:37.769642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.769525 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:37.769642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.769539 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:37.769642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.769547 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:37.769642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.769565 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:37.771267 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.771250 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.771350 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.771281 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:37.772097 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.772077 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:37.772198 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.772111 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:37.772198 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.772126 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:37.795158 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.795127 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-88.ec2.internal\" not found" node="ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.799603 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.799586 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-88.ec2.internal\" not found" node="ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.825708 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.825677 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/66311dffd58d8401224540c8f2917b24-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal\" (UID: \"66311dffd58d8401224540c8f2917b24\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.825846 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.825713 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/66311dffd58d8401224540c8f2917b24-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal\" (UID: \"66311dffd58d8401224540c8f2917b24\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.825846 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.825744 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b69b580532cc9a5ba4501f776b0e392e-config\") pod \"kube-apiserver-proxy-ip-10-0-138-88.ec2.internal\" (UID: \"b69b580532cc9a5ba4501f776b0e392e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.827930 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.827910 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-88.ec2.internal\" not found" Apr 16 18:17:37.926848 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.926770 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/66311dffd58d8401224540c8f2917b24-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal\" (UID: \"66311dffd58d8401224540c8f2917b24\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.926848 
ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.926837 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/66311dffd58d8401224540c8f2917b24-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal\" (UID: \"66311dffd58d8401224540c8f2917b24\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.926989 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.926853 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/66311dffd58d8401224540c8f2917b24-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal\" (UID: \"66311dffd58d8401224540c8f2917b24\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.926989 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.926896 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b69b580532cc9a5ba4501f776b0e392e-config\") pod \"kube-apiserver-proxy-ip-10-0-138-88.ec2.internal\" (UID: \"b69b580532cc9a5ba4501f776b0e392e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.926989 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.926855 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b69b580532cc9a5ba4501f776b0e392e-config\") pod \"kube-apiserver-proxy-ip-10-0-138-88.ec2.internal\" (UID: \"b69b580532cc9a5ba4501f776b0e392e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.926989 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:37.926957 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/66311dffd58d8401224540c8f2917b24-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal\" (UID: \"66311dffd58d8401224540c8f2917b24\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" Apr 16 18:17:37.928011 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:37.927997 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-88.ec2.internal\" not found" Apr 16 18:17:38.028113 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:38.028078 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-88.ec2.internal\" not found" Apr 16 18:17:38.098319 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.098293 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" Apr 16 18:17:38.102891 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.102873 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-88.ec2.internal" Apr 16 18:17:38.128686 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:38.128654 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-88.ec2.internal\" not found" Apr 16 18:17:38.229274 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:38.229179 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-88.ec2.internal\" not found" Apr 16 18:17:38.329627 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:38.329598 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-88.ec2.internal\" not found" Apr 16 18:17:38.430054 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:38.430023 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-88.ec2.internal\" not found" Apr 16 18:17:38.432183 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.432164 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:17:38.432328 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.432312 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:17:38.432382 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.432338 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:17:38.520590 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.520542 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:12:37 +0000 UTC" deadline="2028-01-23 08:46:37.689544136 +0000 UTC" Apr 16 18:17:38.520590 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.520580 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15518h28m59.168967197s" Apr 16 18:17:38.523666 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.523647 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:17:38.530847 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:38.530819 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-88.ec2.internal\" not found" Apr 16 18:17:38.536356 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.536334 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:17:38.557844 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.557818 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8tndw" Apr 16 18:17:38.566347 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.566327 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8tndw" Apr 16 18:17:38.624435 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:38.624379 2579 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb69b580532cc9a5ba4501f776b0e392e.slice/crio-46c8e1a9fc648fdbc34b127b9d49213d351238018ac59fdb6cc3e668af58b949 WatchSource:0}: Error finding container 46c8e1a9fc648fdbc34b127b9d49213d351238018ac59fdb6cc3e668af58b949: Status 404 returned error can't find the container with id 46c8e1a9fc648fdbc34b127b9d49213d351238018ac59fdb6cc3e668af58b949 Apr 16 18:17:38.625379 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:38.625357 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66311dffd58d8401224540c8f2917b24.slice/crio-2ea8b558d80e167502a51bd6577bb904ed09bf762608e1cad85e9c7b10f58e62 WatchSource:0}: Error finding container 2ea8b558d80e167502a51bd6577bb904ed09bf762608e1cad85e9c7b10f58e62: Status 404 returned error can't find the container with id 2ea8b558d80e167502a51bd6577bb904ed09bf762608e1cad85e9c7b10f58e62 Apr 16 18:17:38.629290 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.629274 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:17:38.631889 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:38.631872 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-88.ec2.internal\" not found" Apr 16 18:17:38.660452 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.660416 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:38.667288 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.667247 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" event={"ID":"66311dffd58d8401224540c8f2917b24","Type":"ContainerStarted","Data":"2ea8b558d80e167502a51bd6577bb904ed09bf762608e1cad85e9c7b10f58e62"} Apr 16 18:17:38.668111 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.668087 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-88.ec2.internal" event={"ID":"b69b580532cc9a5ba4501f776b0e392e","Type":"ContainerStarted","Data":"46c8e1a9fc648fdbc34b127b9d49213d351238018ac59fdb6cc3e668af58b949"} Apr 16 18:17:38.732594 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:38.732559 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-88.ec2.internal\" not found" Apr 16 18:17:38.760061 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.759996 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:38.824254 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.824206 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" Apr 16 18:17:38.836681 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.836657 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:17:38.837516 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.837502 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-88.ec2.internal" Apr 16 18:17:38.847210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:38.847189 2579 warnings.go:110] "Warning: 
metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:17:39.503813 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.503778 2579 apiserver.go:52] "Watching apiserver" Apr 16 18:17:39.511965 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.511938 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:17:39.513723 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.513694 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5gncp","openshift-multus/multus-additional-cni-plugins-kx5hq","openshift-multus/network-metrics-daemon-h2jgj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf","openshift-cluster-node-tuning-operator/tuned-dkclb","openshift-dns/node-resolver-sfjlw","openshift-image-registry/node-ca-g2bjm","openshift-network-diagnostics/network-check-target-8wxmz","openshift-network-operator/iptables-alerter-lr9r2","openshift-ovn-kubernetes/ovnkube-node-zkthf","kube-system/konnectivity-agent-z5qzj","kube-system/kube-apiserver-proxy-ip-10-0-138-88.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal"] Apr 16 18:17:39.516780 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.516758 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.518959 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.518907 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.519099 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.519015 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:17:39.519180 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.519142 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:17:39.519288 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.519268 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rf6nn\"" Apr 16 18:17:39.519388 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.519369 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:17:39.519475 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.519434 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:17:39.521138 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.521109 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:17:39.521580 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.521389 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cshbk\"" Apr 16 18:17:39.521580 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.521433 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:17:39.523375 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.523357 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:39.523479 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:39.523450 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31" Apr 16 18:17:39.523537 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.523491 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.525526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.525483 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7w8xx\"" Apr 16 18:17:39.525526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.525483 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:17:39.525669 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.525532 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:17:39.525669 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.525653 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:17:39.525780 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.525740 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.528518 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.528093 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:17:39.528518 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.528384 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:17:39.528518 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.528473 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-j2zj2\"" Apr 16 18:17:39.528871 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.528834 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-sfjlw" Apr 16 18:17:39.530665 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.530647 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qhm2v\"" Apr 16 18:17:39.530924 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.530905 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:17:39.531193 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.531174 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:17:39.531776 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.531755 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g2bjm" Apr 16 18:17:39.533867 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.533847 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:17:39.533994 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.533970 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5n5rs\"" Apr 16 18:17:39.534135 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534116 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz" Apr 16 18:17:39.534212 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:39.534181 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8" Apr 16 18:17:39.534212 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534199 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:17:39.534318 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534281 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:17:39.534432 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534384 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-os-release\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.534499 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534474 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-run-netns\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.534550 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534504 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ff1abd4-4c73-4895-854f-6aa240273e76-cni-binary-copy\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.534550 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534524 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-modprobe-d\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.534550 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534541 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-sys\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.534675 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534563 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-host\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.534675 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534587 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-system-cni-dir\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.534675 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534620 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-run-k8s-cni-cncf-io\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.534675 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534646 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-hostroot\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.534675 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534667 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-etc-kubernetes\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.534893 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534716 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.534893 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-device-dir\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.534893 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534786 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcvbs\" (UniqueName: \"kubernetes.io/projected/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-kube-api-access-xcvbs\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.534893 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-cni-binary-copy\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.534893 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534835 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-multus-socket-dir-parent\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.534893 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534871 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k4dh\" (UniqueName: 
\"kubernetes.io/projected/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-kube-api-access-2k4dh\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.534893 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534893 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ff1abd4-4c73-4895-854f-6aa240273e76-cnibin\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.535150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534908 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ff1abd4-4c73-4895-854f-6aa240273e76-os-release\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.535150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534927 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-socket-dir\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.535150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-kubernetes\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.535150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534967 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-sysctl-conf\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.535150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.534986 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-var-lib-cni-bin\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.535150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535005 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-run-multus-certs\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.535150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535052 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9ff1abd4-4c73-4895-854f-6aa240273e76-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: 
\"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.535150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535084 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2d26\" (UniqueName: \"kubernetes.io/projected/9ff1abd4-4c73-4895-854f-6aa240273e76-kube-api-access-s2d26\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.535150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535108 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-lib-modules\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.535150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535124 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-tuned\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.535150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535140 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-multus-cni-dir\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-var-lib-cni-multus\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535188 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrdxv\" (UniqueName: \"kubernetes.io/projected/a84444e3-6cab-4290-a61c-c01132150e31-kube-api-access-qrdxv\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535212 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-sysctl-d\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535234 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-run\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535263 
2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-tmp\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535288 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wmgm\" (UniqueName: \"kubernetes.io/projected/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-kube-api-access-4wmgm\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535310 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-var-lib-kubelet\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535333 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-multus-daemon-config\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535356 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ff1abd4-4c73-4895-854f-6aa240273e76-system-cni-dir\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535381 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9ff1abd4-4c73-4895-854f-6aa240273e76-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535418 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-registration-dir\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535439 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-sysconfig\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535461 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" 
(UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-systemd\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535483 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-var-lib-kubelet\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535510 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ff1abd4-4c73-4895-854f-6aa240273e76-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.535596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535532 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:39.536183 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535554 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-sys-fs\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.536183 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535579 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-multus-conf-dir\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.536183 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535601 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-etc-selinux\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.536183 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.535624 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-cnibin\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.536473 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.536456 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-lr9r2" Apr 16 18:17:39.537888 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.537870 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:39.538337 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.538318 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:17:39.538518 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.538500 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:17:39.538581 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.538535 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:17:39.538581 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.538541 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-v79tc\"" Apr 16 18:17:39.538968 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.538950 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.540974 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.540955 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:17:39.541085 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.541014 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-ql4vk\"" Apr 16 18:17:39.541316 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.541291 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:17:39.541378 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.541321 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:17:39.541378 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.541325 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-z5qzj" Apr 16 18:17:39.541378 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.541350 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:17:39.541881 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.541860 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:17:39.541980 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.541872 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:17:39.543326 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.543307 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:17:39.543478 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.543417 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:17:39.543583 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.543568 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gjs6b\"" Apr 16 18:17:39.568338 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.568302 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:12:38 +0000 UTC" deadline="2028-01-17 00:06:02.24523637 +0000 UTC" Apr 16 18:17:39.568338 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.568335 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15365h48m22.676904889s" Apr 16 18:17:39.625465 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.625437 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:17:39.636672 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636634 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c49a704-d49e-48e8-a3bf-2cbbf59da5a5-hosts-file\") pod \"node-resolver-sfjlw\" (UID: \"0c49a704-d49e-48e8-a3bf-2cbbf59da5a5\") " pod="openshift-dns/node-resolver-sfjlw" Apr 16 18:17:39.636672 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636681 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-system-cni-dir\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.636910 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-etc-kubernetes\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.636910 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636752 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-etc-kubernetes\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 
16 18:17:39.636910 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636754 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-device-dir\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.636910 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636801 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-system-cni-dir\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.636910 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.636910 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636851 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-cni-binary-copy\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.636910 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636879 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-device-dir\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.636910 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636887 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-multus-socket-dir-parent\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637244 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636915 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k4dh\" (UniqueName: \"kubernetes.io/projected/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-kube-api-access-2k4dh\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637244 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-socket-dir\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.637244 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.636966 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-cni-netd\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.637244 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637014 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-multus-socket-dir-parent\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637244 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637110 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-run-multus-certs\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637244 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637153 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-run-multus-certs\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637244 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637119 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-socket-dir\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.637244 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637172 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9ff1abd4-4c73-4895-854f-6aa240273e76-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.637244 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-tuned\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637248 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a3216a58-ad89-4814-b5b4-7ae5bf98510e-serviceca\") pod \"node-ca-g2bjm\" (UID: \"a3216a58-ad89-4814-b5b4-7ae5bf98510e\") " pod="openshift-image-registry/node-ca-g2bjm" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-multus-cni-dir\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637309 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-var-lib-cni-multus\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-sysctl-d\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637382 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-var-lib-cni-multus\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637455 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-cni-binary-copy\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637451 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gqlh\" (UniqueName: \"kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh\") pod \"network-check-target-8wxmz\" (UID: \"afad1e88-aa25-44f7-8893-4eac3477f6c8\") " pod="openshift-network-diagnostics/network-check-target-8wxmz" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-sysctl-d\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637492 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-systemd-units\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637522 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-slash\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637546 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-var-lib-openvswitch\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637558 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-multus-cni-dir\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637575 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-var-lib-kubelet\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637579 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637597 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-multus-daemon-config\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637606 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-var-lib-kubelet\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637612 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-registration-dir\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.637717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-systemd\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-var-lib-kubelet\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637675 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-run-netns\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 
18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637686 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-registration-dir\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637745 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-run-systemd\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637753 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-var-lib-kubelet\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637751 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-systemd\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637760 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9ff1abd4-4c73-4895-854f-6aa240273e76-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637777 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637807 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-sys-fs\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637832 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg6k6\" (UniqueName: \"kubernetes.io/projected/0c49a704-d49e-48e8-a3bf-2cbbf59da5a5-kube-api-access-rg6k6\") pod \"node-resolver-sfjlw\" (UID: \"0c49a704-d49e-48e8-a3bf-2cbbf59da5a5\") " pod="openshift-dns/node-resolver-sfjlw" Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:39.637854 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 
Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637902 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-sys-fs\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf"
Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.637856 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-kubelet\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:39.637968 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs podName:a84444e3-6cab-4290-a61c-c01132150e31 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:40.137915236 +0000 UTC m=+3.045495263 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs") pod "network-metrics-daemon-h2jgj" (UID: "a84444e3-6cab-4290-a61c-c01132150e31") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638004 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7b29382-768b-4aa5-a896-07f32fc4d4e6-env-overrides\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638029 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7b29382-768b-4aa5-a896-07f32fc4d4e6-ovnkube-script-lib\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.638494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638057 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-multus-conf-dir\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638065 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-multus-daemon-config\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-etc-selinux\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638115 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-multus-conf-dir\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638147 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-run-openvswitch\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638173 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-log-socket\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638203 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f5e2592f-866e-429c-bc0a-2d1321bfc52d-iptables-alerter-script\") pod \"iptables-alerter-lr9r2\" (UID: \"f5e2592f-866e-429c-bc0a-2d1321bfc52d\") " pod="openshift-network-operator/iptables-alerter-lr9r2"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638235 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ff1abd4-4c73-4895-854f-6aa240273e76-cni-binary-copy\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638265 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-etc-selinux\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638323 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-modprobe-d\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638348 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-sys\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638375 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3216a58-ad89-4814-b5b4-7ae5bf98510e-host\") pod \"node-ca-g2bjm\" (UID: \"a3216a58-ad89-4814-b5b4-7ae5bf98510e\") " pod="openshift-image-registry/node-ca-g2bjm"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638415 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-run-ovn\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638450 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-sys\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638460 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-run-k8s-cni-cncf-io\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638450 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-modprobe-d\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638521 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-run-k8s-cni-cncf-io\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp"
Apr 16 18:17:39.639225 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638545 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-hostroot\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp"
Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638572 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf"
Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638598 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcvbs\" (UniqueName: \"kubernetes.io/projected/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-kube-api-access-xcvbs\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf"
\"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-hostroot\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638657 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638657 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ff1abd4-4c73-4895-854f-6aa240273e76-cni-binary-copy\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638683 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-kubernetes\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638709 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-sysctl-conf\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638731 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-run\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638753 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d5e8b6a3-f88f-40cd-be49-7c8a4efe8164-agent-certs\") pod \"konnectivity-agent-z5qzj\" (UID: \"d5e8b6a3-f88f-40cd-be49-7c8a4efe8164\") " pod="kube-system/konnectivity-agent-z5qzj" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638762 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-kubernetes\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638775 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ff1abd4-4c73-4895-854f-6aa240273e76-cnibin\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638796 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-run\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638799 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ff1abd4-4c73-4895-854f-6aa240273e76-os-release\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638815 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ff1abd4-4c73-4895-854f-6aa240273e76-cnibin\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638838 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0c49a704-d49e-48e8-a3bf-2cbbf59da5a5-tmp-dir\") pod \"node-resolver-sfjlw\" (UID: \"0c49a704-d49e-48e8-a3bf-2cbbf59da5a5\") " pod="openshift-dns/node-resolver-sfjlw" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638849 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ff1abd4-4c73-4895-854f-6aa240273e76-os-release\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.639885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638863 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-sysctl-conf\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638863 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-etc-openvswitch\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638900 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-node-log\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638946 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-var-lib-cni-bin\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: 
I0416 18:17:39.638974 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2d26\" (UniqueName: \"kubernetes.io/projected/9ff1abd4-4c73-4895-854f-6aa240273e76-kube-api-access-s2d26\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.638998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-lib-modules\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639024 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7b29382-768b-4aa5-a896-07f32fc4d4e6-ovnkube-config\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639035 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-var-lib-cni-bin\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639047 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7b29382-768b-4aa5-a896-07f32fc4d4e6-ovn-node-metrics-cert\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639072 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrdxv\" (UniqueName: \"kubernetes.io/projected/a84444e3-6cab-4290-a61c-c01132150e31-kube-api-access-qrdxv\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639095 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-tmp\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-lib-modules\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639120 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wmgm\" (UniqueName: \"kubernetes.io/projected/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-kube-api-access-4wmgm\") pod \"tuned-dkclb\" (UID: 
\"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmljj\" (UniqueName: \"kubernetes.io/projected/a3216a58-ad89-4814-b5b4-7ae5bf98510e-kube-api-access-dmljj\") pod \"node-ca-g2bjm\" (UID: \"a3216a58-ad89-4814-b5b4-7ae5bf98510e\") " pod="openshift-image-registry/node-ca-g2bjm" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639190 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxfzn\" (UniqueName: \"kubernetes.io/projected/f5e2592f-866e-429c-bc0a-2d1321bfc52d-kube-api-access-lxfzn\") pod \"iptables-alerter-lr9r2\" (UID: \"f5e2592f-866e-429c-bc0a-2d1321bfc52d\") " pod="openshift-network-operator/iptables-alerter-lr9r2" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639217 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ff1abd4-4c73-4895-854f-6aa240273e76-system-cni-dir\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639245 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9ff1abd4-4c73-4895-854f-6aa240273e76-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.640459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639272 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-sysconfig\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639299 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d5e8b6a3-f88f-40cd-be49-7c8a4efe8164-konnectivity-ca\") pod \"konnectivity-agent-z5qzj\" (UID: \"d5e8b6a3-f88f-40cd-be49-7c8a4efe8164\") " pod="kube-system/konnectivity-agent-z5qzj" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639278 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ff1abd4-4c73-4895-854f-6aa240273e76-system-cni-dir\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639322 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f5e2592f-866e-429c-bc0a-2d1321bfc52d-host-slash\") pod \"iptables-alerter-lr9r2\" (UID: \"f5e2592f-866e-429c-bc0a-2d1321bfc52d\") " pod="openshift-network-operator/iptables-alerter-lr9r2" Apr 16 18:17:39.641028 ip-10-0-138-88 
kubenswrapper[2579]: I0416 18:17:39.639347 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ff1abd4-4c73-4895-854f-6aa240273e76-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639370 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-sysconfig\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639376 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpkqh\" (UniqueName: \"kubernetes.io/projected/e7b29382-768b-4aa5-a896-07f32fc4d4e6-kube-api-access-tpkqh\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639419 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-cnibin\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639470 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-os-release\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639485 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-run-netns\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-host\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639529 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-cni-bin\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639581 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-cnibin\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639595 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-host-run-netns\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639581 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-os-release\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639626 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-host\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.641028 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639736 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9ff1abd4-4c73-4895-854f-6aa240273e76-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.641687 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.639963 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ff1abd4-4c73-4895-854f-6aa240273e76-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.641687 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.641079 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-etc-tuned\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.641687 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.641357 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-tmp\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.645889 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.645865 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k4dh\" (UniqueName: \"kubernetes.io/projected/5eb4032a-87eb-4b5c-955d-2f54c02cd4de-kube-api-access-2k4dh\") pod \"multus-5gncp\" (UID: \"5eb4032a-87eb-4b5c-955d-2f54c02cd4de\") " pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.649061 
Apr 16 18:17:39.649061 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.649037 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcvbs\" (UniqueName: \"kubernetes.io/projected/e80b30e3-7c06-4a7b-a690-65a3e682b4d1-kube-api-access-xcvbs\") pod \"aws-ebs-csi-driver-node-m67rf\" (UID: \"e80b30e3-7c06-4a7b-a690-65a3e682b4d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf"
Apr 16 18:17:39.650042 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.650021 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wmgm\" (UniqueName: \"kubernetes.io/projected/b93efba3-fc4f-41b6-82a3-6c4f89f9a12d-kube-api-access-4wmgm\") pod \"tuned-dkclb\" (UID: \"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d\") " pod="openshift-cluster-node-tuning-operator/tuned-dkclb"
Apr 16 18:17:39.650326 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.650280 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2d26\" (UniqueName: \"kubernetes.io/projected/9ff1abd4-4c73-4895-854f-6aa240273e76-kube-api-access-s2d26\") pod \"multus-additional-cni-plugins-kx5hq\" (UID: \"9ff1abd4-4c73-4895-854f-6aa240273e76\") " pod="openshift-multus/multus-additional-cni-plugins-kx5hq"
Apr 16 18:17:39.650611 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.650588 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrdxv\" (UniqueName: \"kubernetes.io/projected/a84444e3-6cab-4290-a61c-c01132150e31-kube-api-access-qrdxv\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:17:39.690341 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.690309 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:39.740731 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.740696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a3216a58-ad89-4814-b5b4-7ae5bf98510e-serviceca\") pod \"node-ca-g2bjm\" (UID: \"a3216a58-ad89-4814-b5b4-7ae5bf98510e\") " pod="openshift-image-registry/node-ca-g2bjm"
Apr 16 18:17:39.740731 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.740739 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gqlh\" (UniqueName: \"kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh\") pod \"network-check-target-8wxmz\" (UID: \"afad1e88-aa25-44f7-8893-4eac3477f6c8\") " pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:17:39.740957 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.740838 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-systemd-units\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.740957 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.740862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-slash\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.740957 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.740888 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-var-lib-openvswitch\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.740957 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.740915 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-run-netns\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.740957 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.740938 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-run-systemd\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.740957 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.740942 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-systemd-units\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.740959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-slash\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.740980 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rg6k6\" (UniqueName: \"kubernetes.io/projected/0c49a704-d49e-48e8-a3bf-2cbbf59da5a5-kube-api-access-rg6k6\") pod \"node-resolver-sfjlw\" (UID: \"0c49a704-d49e-48e8-a3bf-2cbbf59da5a5\") " pod="openshift-dns/node-resolver-sfjlw"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741006 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-kubelet\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741011 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-run-netns\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741012 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-var-lib-openvswitch\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741045 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-run-systemd\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741060 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-kubelet\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741073 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7b29382-768b-4aa5-a896-07f32fc4d4e6-env-overrides\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741098 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7b29382-768b-4aa5-a896-07f32fc4d4e6-ovnkube-script-lib\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741125 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-run-openvswitch\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741148 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-log-socket\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f5e2592f-866e-429c-bc0a-2d1321bfc52d-iptables-alerter-script\") pod \"iptables-alerter-lr9r2\" (UID: \"f5e2592f-866e-429c-bc0a-2d1321bfc52d\") " pod="openshift-network-operator/iptables-alerter-lr9r2"
Apr 16 18:17:39.741210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741184 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-run-openvswitch\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
pod="openshift-image-registry/node-ca-g2bjm" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741226 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3216a58-ad89-4814-b5b4-7ae5bf98510e-host\") pod \"node-ca-g2bjm\" (UID: \"a3216a58-ad89-4814-b5b4-7ae5bf98510e\") " pod="openshift-image-registry/node-ca-g2bjm" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741231 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a3216a58-ad89-4814-b5b4-7ae5bf98510e-serviceca\") pod \"node-ca-g2bjm\" (UID: \"a3216a58-ad89-4814-b5b4-7ae5bf98510e\") " pod="openshift-image-registry/node-ca-g2bjm" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-run-ovn\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741265 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-log-socket\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741280 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-run-ovn\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d5e8b6a3-f88f-40cd-be49-7c8a4efe8164-agent-certs\") pod \"konnectivity-agent-z5qzj\" (UID: \"d5e8b6a3-f88f-40cd-be49-7c8a4efe8164\") " pod="kube-system/konnectivity-agent-z5qzj" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0c49a704-d49e-48e8-a3bf-2cbbf59da5a5-tmp-dir\") pod \"node-resolver-sfjlw\" (UID: \"0c49a704-d49e-48e8-a3bf-2cbbf59da5a5\") " pod="openshift-dns/node-resolver-sfjlw" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741375 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-etc-openvswitch\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741419 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-node-log\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.741837 ip-10-0-138-88 
kubenswrapper[2579]: I0416 18:17:39.741450 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7b29382-768b-4aa5-a896-07f32fc4d4e6-ovnkube-config\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7b29382-768b-4aa5-a896-07f32fc4d4e6-ovn-node-metrics-cert\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741491 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-etc-openvswitch\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741541 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-node-log\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7b29382-768b-4aa5-a896-07f32fc4d4e6-env-overrides\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741502 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmljj\" (UniqueName: \"kubernetes.io/projected/a3216a58-ad89-4814-b5b4-7ae5bf98510e-kube-api-access-dmljj\") pod \"node-ca-g2bjm\" (UID: \"a3216a58-ad89-4814-b5b4-7ae5bf98510e\") " pod="openshift-image-registry/node-ca-g2bjm" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741755 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfzn\" (UniqueName: \"kubernetes.io/projected/f5e2592f-866e-429c-bc0a-2d1321bfc52d-kube-api-access-lxfzn\") pod \"iptables-alerter-lr9r2\" (UID: \"f5e2592f-866e-429c-bc0a-2d1321bfc52d\") " pod="openshift-network-operator/iptables-alerter-lr9r2" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741786 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d5e8b6a3-f88f-40cd-be49-7c8a4efe8164-konnectivity-ca\") pod \"konnectivity-agent-z5qzj\" (UID: \"d5e8b6a3-f88f-40cd-be49-7c8a4efe8164\") " pod="kube-system/konnectivity-agent-z5qzj" Apr 16 18:17:39.741837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741810 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f5e2592f-866e-429c-bc0a-2d1321bfc52d-host-slash\") pod \"iptables-alerter-lr9r2\" (UID: \"f5e2592f-866e-429c-bc0a-2d1321bfc52d\") " 
pod="openshift-network-operator/iptables-alerter-lr9r2" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741811 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f5e2592f-866e-429c-bc0a-2d1321bfc52d-iptables-alerter-script\") pod \"iptables-alerter-lr9r2\" (UID: \"f5e2592f-866e-429c-bc0a-2d1321bfc52d\") " pod="openshift-network-operator/iptables-alerter-lr9r2" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741838 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpkqh\" (UniqueName: \"kubernetes.io/projected/e7b29382-768b-4aa5-a896-07f32fc4d4e6-kube-api-access-tpkqh\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741868 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741896 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-cni-bin\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741921 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c49a704-d49e-48e8-a3bf-2cbbf59da5a5-hosts-file\") pod \"node-resolver-sfjlw\" (UID: \"0c49a704-d49e-48e8-a3bf-2cbbf59da5a5\") " pod="openshift-dns/node-resolver-sfjlw" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741944 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0c49a704-d49e-48e8-a3bf-2cbbf59da5a5-tmp-dir\") pod \"node-resolver-sfjlw\" (UID: \"0c49a704-d49e-48e8-a3bf-2cbbf59da5a5\") " pod="openshift-dns/node-resolver-sfjlw" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741951 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741997 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.741996 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-cni-netd\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.742037 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f5e2592f-866e-429c-bc0a-2d1321bfc52d-host-slash\") pod \"iptables-alerter-lr9r2\" (UID: \"f5e2592f-866e-429c-bc0a-2d1321bfc52d\") " pod="openshift-network-operator/iptables-alerter-lr9r2" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.742067 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-cni-netd\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.742217 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7b29382-768b-4aa5-a896-07f32fc4d4e6-ovnkube-config\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.742226 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-cni-bin\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.742267 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7b29382-768b-4aa5-a896-07f32fc4d4e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.742526 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.742292 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c49a704-d49e-48e8-a3bf-2cbbf59da5a5-hosts-file\") pod \"node-resolver-sfjlw\" (UID: \"0c49a704-d49e-48e8-a3bf-2cbbf59da5a5\") " pod="openshift-dns/node-resolver-sfjlw" Apr 16 18:17:39.743164 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.742740 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7b29382-768b-4aa5-a896-07f32fc4d4e6-ovnkube-script-lib\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.743268 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.743240 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d5e8b6a3-f88f-40cd-be49-7c8a4efe8164-konnectivity-ca\") pod \"konnectivity-agent-z5qzj\" (UID: \"d5e8b6a3-f88f-40cd-be49-7c8a4efe8164\") " pod="kube-system/konnectivity-agent-z5qzj" Apr 16 18:17:39.744053 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.744034 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" 
(UniqueName: \"kubernetes.io/secret/d5e8b6a3-f88f-40cd-be49-7c8a4efe8164-agent-certs\") pod \"konnectivity-agent-z5qzj\" (UID: \"d5e8b6a3-f88f-40cd-be49-7c8a4efe8164\") " pod="kube-system/konnectivity-agent-z5qzj" Apr 16 18:17:39.744355 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.744331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7b29382-768b-4aa5-a896-07f32fc4d4e6-ovn-node-metrics-cert\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.749009 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:39.748983 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:39.749113 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:39.749017 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:39.749113 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:39.749031 2579 projected.go:194] Error preparing data for projected volume kube-api-access-4gqlh for pod openshift-network-diagnostics/network-check-target-8wxmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:39.749113 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:39.749106 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh podName:afad1e88-aa25-44f7-8893-4eac3477f6c8 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:40.249088064 +0000 UTC m=+3.156668093 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4gqlh" (UniqueName: "kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh") pod "network-check-target-8wxmz" (UID: "afad1e88-aa25-44f7-8893-4eac3477f6c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:39.750934 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.750902 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxfzn\" (UniqueName: \"kubernetes.io/projected/f5e2592f-866e-429c-bc0a-2d1321bfc52d-kube-api-access-lxfzn\") pod \"iptables-alerter-lr9r2\" (UID: \"f5e2592f-866e-429c-bc0a-2d1321bfc52d\") " pod="openshift-network-operator/iptables-alerter-lr9r2" Apr 16 18:17:39.751134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.751114 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmljj\" (UniqueName: \"kubernetes.io/projected/a3216a58-ad89-4814-b5b4-7ae5bf98510e-kube-api-access-dmljj\") pod \"node-ca-g2bjm\" (UID: \"a3216a58-ad89-4814-b5b4-7ae5bf98510e\") " pod="openshift-image-registry/node-ca-g2bjm" Apr 16 18:17:39.751786 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.751763 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpkqh\" (UniqueName: \"kubernetes.io/projected/e7b29382-768b-4aa5-a896-07f32fc4d4e6-kube-api-access-tpkqh\") pod \"ovnkube-node-zkthf\" (UID: \"e7b29382-768b-4aa5-a896-07f32fc4d4e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.754033 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.753984 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg6k6\" (UniqueName: \"kubernetes.io/projected/0c49a704-d49e-48e8-a3bf-2cbbf59da5a5-kube-api-access-rg6k6\") pod \"node-resolver-sfjlw\" (UID: \"0c49a704-d49e-48e8-a3bf-2cbbf59da5a5\") " pod="openshift-dns/node-resolver-sfjlw" Apr 16 18:17:39.829878 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.829835 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5gncp" Apr 16 18:17:39.836764 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.836737 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kx5hq" Apr 16 18:17:39.846676 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.846634 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sfjlw" Apr 16 18:17:39.853412 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.853369 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" Apr 16 18:17:39.860048 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.860019 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dkclb" Apr 16 18:17:39.867730 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.867701 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g2bjm" Apr 16 18:17:39.876422 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.876378 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-lr9r2" Apr 16 18:17:39.883202 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.883173 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:17:39.886840 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:39.886820 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z5qzj" Apr 16 18:17:40.144904 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.144827 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:40.145050 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:40.144958 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:40.145050 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:40.145026 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs podName:a84444e3-6cab-4290-a61c-c01132150e31 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:41.145010396 +0000 UTC m=+4.052590403 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs") pod "network-metrics-daemon-h2jgj" (UID: "a84444e3-6cab-4290-a61c-c01132150e31") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:40.264968 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:40.264863 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c49a704_d49e_48e8_a3bf_2cbbf59da5a5.slice/crio-2ebde57d1bc7424fb64e17c2bc4ecd430e98d51c3c4201cb8577541a6dc1babe WatchSource:0}: Error finding container 2ebde57d1bc7424fb64e17c2bc4ecd430e98d51c3c4201cb8577541a6dc1babe: Status 404 returned error can't find the container with id 2ebde57d1bc7424fb64e17c2bc4ecd430e98d51c3c4201cb8577541a6dc1babe Apr 16 18:17:40.266453 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:40.266431 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5e2592f_866e_429c_bc0a_2d1321bfc52d.slice/crio-f4bfd8a1aa8e700c93dbfa7961590ef77a2d0e14e313c9a50bc6d7077ea44252 WatchSource:0}: Error finding container f4bfd8a1aa8e700c93dbfa7961590ef77a2d0e14e313c9a50bc6d7077ea44252: Status 404 returned error can't find the container with id f4bfd8a1aa8e700c93dbfa7961590ef77a2d0e14e313c9a50bc6d7077ea44252 Apr 16 18:17:40.267587 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:40.267485 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ff1abd4_4c73_4895_854f_6aa240273e76.slice/crio-aa36dfe81ffa5c7552649f7c829dae38682469d658d9563521ff58847fc1fe6a WatchSource:0}: Error finding container aa36dfe81ffa5c7552649f7c829dae38682469d658d9563521ff58847fc1fe6a: Status 404 returned error can't find the container with id aa36dfe81ffa5c7552649f7c829dae38682469d658d9563521ff58847fc1fe6a Apr 16 18:17:40.270484 ip-10-0-138-88 kubenswrapper[2579]: W0416 
18:17:40.270464 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode80b30e3_7c06_4a7b_a690_65a3e682b4d1.slice/crio-1008295b6e597f9025e84320945df34b4039dfcd4741fbb56bd45ed894a6b8c8 WatchSource:0}: Error finding container 1008295b6e597f9025e84320945df34b4039dfcd4741fbb56bd45ed894a6b8c8: Status 404 returned error can't find the container with id 1008295b6e597f9025e84320945df34b4039dfcd4741fbb56bd45ed894a6b8c8 Apr 16 18:17:40.272387 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:40.272348 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5e8b6a3_f88f_40cd_be49_7c8a4efe8164.slice/crio-99ebc7fb3be83c6ba5a4af160f737e34a01c03acd55d62d32c75df15ef5aaf12 WatchSource:0}: Error finding container 99ebc7fb3be83c6ba5a4af160f737e34a01c03acd55d62d32c75df15ef5aaf12: Status 404 returned error can't find the container with id 99ebc7fb3be83c6ba5a4af160f737e34a01c03acd55d62d32c75df15ef5aaf12 Apr 16 18:17:40.274093 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:17:40.273725 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3216a58_ad89_4814_b5b4_7ae5bf98510e.slice/crio-4d3b6792d5d95b0d38a3553557b59d872a59960363120fb0f9723b76b0fc886b WatchSource:0}: Error finding container 4d3b6792d5d95b0d38a3553557b59d872a59960363120fb0f9723b76b0fc886b: Status 404 returned error can't find the container with id 4d3b6792d5d95b0d38a3553557b59d872a59960363120fb0f9723b76b0fc886b Apr 16 18:17:40.346297 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.346120 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gqlh\" (UniqueName: \"kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh\") pod \"network-check-target-8wxmz\" (UID: \"afad1e88-aa25-44f7-8893-4eac3477f6c8\") " pod="openshift-network-diagnostics/network-check-target-8wxmz" Apr 16 18:17:40.346297 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:40.346281 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:40.346510 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:40.346327 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:40.346510 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:40.346339 2579 projected.go:194] Error preparing data for projected volume kube-api-access-4gqlh for pod openshift-network-diagnostics/network-check-target-8wxmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:40.346510 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:40.346453 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh podName:afad1e88-aa25-44f7-8893-4eac3477f6c8 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:41.346432124 +0000 UTC m=+4.254012131 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4gqlh" (UniqueName: "kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh") pod "network-check-target-8wxmz" (UID: "afad1e88-aa25-44f7-8893-4eac3477f6c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:40.568512 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.568474 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:12:38 +0000 UTC" deadline="2027-12-24 17:00:10.00190296 +0000 UTC" Apr 16 18:17:40.568512 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.568507 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14806h42m29.433399164s" Apr 16 18:17:40.678052 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.678003 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5gncp" event={"ID":"5eb4032a-87eb-4b5c-955d-2f54c02cd4de","Type":"ContainerStarted","Data":"0216630be31b4b744f22b810452c660811417763fe9435b0c1651e60f80215e5"} Apr 16 18:17:40.682894 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.682834 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" event={"ID":"e7b29382-768b-4aa5-a896-07f32fc4d4e6","Type":"ContainerStarted","Data":"95285c05306479305568c31cab53d824285add68c6606ea5d4527f85f4d6f8e7"} Apr 16 18:17:40.686176 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.686110 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g2bjm" event={"ID":"a3216a58-ad89-4814-b5b4-7ae5bf98510e","Type":"ContainerStarted","Data":"4d3b6792d5d95b0d38a3553557b59d872a59960363120fb0f9723b76b0fc886b"} Apr 16 18:17:40.691861 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.691374 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z5qzj" event={"ID":"d5e8b6a3-f88f-40cd-be49-7c8a4efe8164","Type":"ContainerStarted","Data":"99ebc7fb3be83c6ba5a4af160f737e34a01c03acd55d62d32c75df15ef5aaf12"} Apr 16 18:17:40.700802 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.700760 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" event={"ID":"e80b30e3-7c06-4a7b-a690-65a3e682b4d1","Type":"ContainerStarted","Data":"1008295b6e597f9025e84320945df34b4039dfcd4741fbb56bd45ed894a6b8c8"} Apr 16 18:17:40.704609 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.704546 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx5hq" event={"ID":"9ff1abd4-4c73-4895-854f-6aa240273e76","Type":"ContainerStarted","Data":"aa36dfe81ffa5c7552649f7c829dae38682469d658d9563521ff58847fc1fe6a"} Apr 16 18:17:40.710853 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.710076 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-88.ec2.internal" event={"ID":"b69b580532cc9a5ba4501f776b0e392e","Type":"ContainerStarted","Data":"a34fbba0fcf31db41dc28e2f7c972898af8b8b20f91f5cf3ec6c9192ab6d4d64"} Apr 16 18:17:40.717010 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.716979 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dkclb" 
event={"ID":"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d","Type":"ContainerStarted","Data":"2634b3b07025b3cc59ce19b9c2dc8df93d42b1f351e3f7b2b95f59842ff0f54b"} Apr 16 18:17:40.724747 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.724718 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lr9r2" event={"ID":"f5e2592f-866e-429c-bc0a-2d1321bfc52d","Type":"ContainerStarted","Data":"f4bfd8a1aa8e700c93dbfa7961590ef77a2d0e14e313c9a50bc6d7077ea44252"} Apr 16 18:17:40.730075 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:40.730045 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sfjlw" event={"ID":"0c49a704-d49e-48e8-a3bf-2cbbf59da5a5","Type":"ContainerStarted","Data":"2ebde57d1bc7424fb64e17c2bc4ecd430e98d51c3c4201cb8577541a6dc1babe"} Apr 16 18:17:41.152802 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:41.152733 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:41.153347 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:41.153327 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:41.153460 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:41.153411 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs podName:a84444e3-6cab-4290-a61c-c01132150e31 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:43.153377418 +0000 UTC m=+6.060957432 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs") pod "network-metrics-daemon-h2jgj" (UID: "a84444e3-6cab-4290-a61c-c01132150e31") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:41.357818 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:41.356988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gqlh\" (UniqueName: \"kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh\") pod \"network-check-target-8wxmz\" (UID: \"afad1e88-aa25-44f7-8893-4eac3477f6c8\") " pod="openshift-network-diagnostics/network-check-target-8wxmz" Apr 16 18:17:41.357818 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:41.357171 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:41.357818 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:41.357190 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:41.357818 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:41.357203 2579 projected.go:194] Error preparing data for projected volume kube-api-access-4gqlh for pod openshift-network-diagnostics/network-check-target-8wxmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:41.357818 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:41.357264 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh podName:afad1e88-aa25-44f7-8893-4eac3477f6c8 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:43.357245476 +0000 UTC m=+6.264825488 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4gqlh" (UniqueName: "kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh") pod "network-check-target-8wxmz" (UID: "afad1e88-aa25-44f7-8893-4eac3477f6c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:41.667923 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:41.667219 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:41.667923 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:41.667355 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31" Apr 16 18:17:41.667923 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:41.667777 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz" Apr 16 18:17:41.667923 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:41.667863 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8" Apr 16 18:17:41.743615 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:41.743551 2579 generic.go:358] "Generic (PLEG): container finished" podID="66311dffd58d8401224540c8f2917b24" containerID="be9b0ede0bba38095057b1795ee40001ce12e7173fa47fe28626e78c0cb7340a" exitCode=0 Apr 16 18:17:41.744584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:41.744540 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" event={"ID":"66311dffd58d8401224540c8f2917b24","Type":"ContainerDied","Data":"be9b0ede0bba38095057b1795ee40001ce12e7173fa47fe28626e78c0cb7340a"} Apr 16 18:17:41.759473 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:41.759386 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-88.ec2.internal" podStartSLOduration=3.759364341 podStartE2EDuration="3.759364341s" podCreationTimestamp="2026-04-16 18:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:40.726344629 +0000 UTC m=+3.633924659" watchObservedRunningTime="2026-04-16 18:17:41.759364341 +0000 UTC m=+4.666944372" Apr 16 18:17:42.756353 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:42.756315 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" event={"ID":"66311dffd58d8401224540c8f2917b24","Type":"ContainerStarted","Data":"c780d1fa0a425873ff0245c494f1cf49e6836ee1472d00b963c512f05479d433"} Apr 16 18:17:43.175361 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:43.175269 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:43.175532 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:43.175480 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:43.175599 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:43.175548 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs podName:a84444e3-6cab-4290-a61c-c01132150e31 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:47.17552786 +0000 UTC m=+10.083107870 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs") pod "network-metrics-daemon-h2jgj" (UID: "a84444e3-6cab-4290-a61c-c01132150e31") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:43.377324 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:43.377283 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gqlh\" (UniqueName: \"kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh\") pod \"network-check-target-8wxmz\" (UID: \"afad1e88-aa25-44f7-8893-4eac3477f6c8\") " pod="openshift-network-diagnostics/network-check-target-8wxmz" Apr 16 18:17:43.377519 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:43.377495 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:43.377598 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:43.377524 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:43.377598 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:43.377539 2579 projected.go:194] Error preparing data for projected volume kube-api-access-4gqlh for pod openshift-network-diagnostics/network-check-target-8wxmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:43.377693 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:43.377602 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh podName:afad1e88-aa25-44f7-8893-4eac3477f6c8 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:47.377582165 +0000 UTC m=+10.285162175 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-4gqlh" (UniqueName: "kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh") pod "network-check-target-8wxmz" (UID: "afad1e88-aa25-44f7-8893-4eac3477f6c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:43.669376 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:43.669343 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz" Apr 16 18:17:43.669578 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:43.669470 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8" Apr 16 18:17:43.669578 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:43.669343 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:43.669578 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:43.669573 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31" Apr 16 18:17:45.664505 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:45.664470 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:45.664961 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:45.664611 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31" Apr 16 18:17:45.664961 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:45.664811 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz" Apr 16 18:17:45.664961 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:45.664949 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8" Apr 16 18:17:47.046503 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.046449 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-88.ec2.internal" podStartSLOduration=9.046432155 podStartE2EDuration="9.046432155s" podCreationTimestamp="2026-04-16 18:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:42.771746241 +0000 UTC m=+5.679326272" watchObservedRunningTime="2026-04-16 18:17:47.046432155 +0000 UTC m=+9.954012185" Apr 16 18:17:47.047095 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.047023 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-nl9sg"] Apr 16 18:17:47.057105 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.057059 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:17:47.057243 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.057190 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa" Apr 16 18:17:47.109537 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.109431 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-kubelet-config\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:17:47.109537 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.109503 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:17:47.109537 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.109536 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-dbus\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:17:47.210509 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.210471 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-dbus\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:17:47.210699 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.210543 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:47.210699 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.210583 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-kubelet-config\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:17:47.210699 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.210614 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:17:47.210839 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.210733 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:47.210839 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.210799 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret podName:d0db8fbf-eebe-4eb1-84d2-ef97f04477fa nodeName:}" failed. 
No retries permitted until 2026-04-16 18:17:47.710778154 +0000 UTC m=+10.618358181 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret") pod "global-pull-secret-syncer-nl9sg" (UID: "d0db8fbf-eebe-4eb1-84d2-ef97f04477fa") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:47.210940 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.210834 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:47.210940 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.210907 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs podName:a84444e3-6cab-4290-a61c-c01132150e31 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:55.210879861 +0000 UTC m=+18.118459872 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs") pod "network-metrics-daemon-h2jgj" (UID: "a84444e3-6cab-4290-a61c-c01132150e31") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:47.211084 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.211065 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-dbus\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:17:47.211084 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.211077 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-kubelet-config\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:17:47.412766 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.412678 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gqlh\" (UniqueName: \"kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh\") pod \"network-check-target-8wxmz\" (UID: \"afad1e88-aa25-44f7-8893-4eac3477f6c8\") " pod="openshift-network-diagnostics/network-check-target-8wxmz" Apr 16 18:17:47.412953 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.412868 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:47.412953 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.412893 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:47.412953 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.412909 2579 projected.go:194] Error preparing data for projected volume kube-api-access-4gqlh for pod openshift-network-diagnostics/network-check-target-8wxmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:47.413116 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.412975 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh podName:afad1e88-aa25-44f7-8893-4eac3477f6c8 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:55.412954838 +0000 UTC m=+18.320534862 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-4gqlh" (UniqueName: "kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh") pod "network-check-target-8wxmz" (UID: "afad1e88-aa25-44f7-8893-4eac3477f6c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:47.665538 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.665453 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz" Apr 16 18:17:47.665714 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.665572 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8" Apr 16 18:17:47.665972 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.665947 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:17:47.666088 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.666055 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31" Apr 16 18:17:47.716089 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:47.715463 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:17:47.716089 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.715692 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:47.716089 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:47.715755 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret podName:d0db8fbf-eebe-4eb1-84d2-ef97f04477fa nodeName:}" failed. No retries permitted until 2026-04-16 18:17:48.715735218 +0000 UTC m=+11.623315238 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret") pod "global-pull-secret-syncer-nl9sg" (UID: "d0db8fbf-eebe-4eb1-84d2-ef97f04477fa") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:48.664604 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:48.664570 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:17:48.665095 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:48.664708 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa"
Apr 16 18:17:48.724887 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:48.724846 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:17:48.725099 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:48.724998 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:48.725099 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:48.725070 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret podName:d0db8fbf-eebe-4eb1-84d2-ef97f04477fa nodeName:}" failed. No retries permitted until 2026-04-16 18:17:50.725051189 +0000 UTC m=+13.632631209 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret") pod "global-pull-secret-syncer-nl9sg" (UID: "d0db8fbf-eebe-4eb1-84d2-ef97f04477fa") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:49.664562 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:49.664476 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:17:49.664562 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:49.664476 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:17:49.665077 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:49.664622 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8"
Apr 16 18:17:49.665077 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:49.664747 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31"
Apr 16 18:17:50.664861 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:50.664823 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:17:50.665322 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:50.664958 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa"
Apr 16 18:17:50.742771 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:50.742738 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:17:50.742959 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:50.742891 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:50.743022 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:50.742968 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret podName:d0db8fbf-eebe-4eb1-84d2-ef97f04477fa nodeName:}" failed. No retries permitted until 2026-04-16 18:17:54.742948956 +0000 UTC m=+17.650528984 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret") pod "global-pull-secret-syncer-nl9sg" (UID: "d0db8fbf-eebe-4eb1-84d2-ef97f04477fa") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:51.664881 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:51.664844 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:17:51.664881 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:51.664880 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:17:51.665379 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:51.664991 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31"
Apr 16 18:17:51.665379 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:51.665114 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8"
Apr 16 18:17:52.664467 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:52.664424 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:17:52.664661 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:52.664574 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa"
Apr 16 18:17:53.664704 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:53.664663 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:17:53.665132 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:53.664663 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:17:53.665132 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:53.664825 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31"
Apr 16 18:17:53.665132 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:53.664958 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8"
Apr 16 18:17:54.664706 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:54.664671 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:17:54.665221 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:54.664789 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa"
Apr 16 18:17:54.768688 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:54.768647 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:17:54.768859 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:54.768756 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:54.768859 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:54.768814 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret podName:d0db8fbf-eebe-4eb1-84d2-ef97f04477fa nodeName:}" failed. No retries permitted until 2026-04-16 18:18:02.768800405 +0000 UTC m=+25.676380417 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret") pod "global-pull-secret-syncer-nl9sg" (UID: "d0db8fbf-eebe-4eb1-84d2-ef97f04477fa") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:17:55.272261 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:55.272222 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:17:55.272482 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:55.272419 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:55.272549 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:55.272499 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs podName:a84444e3-6cab-4290-a61c-c01132150e31 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:11.272478155 +0000 UTC m=+34.180058166 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs") pod "network-metrics-daemon-h2jgj" (UID: "a84444e3-6cab-4290-a61c-c01132150e31") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:55.473548 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:55.473507 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gqlh\" (UniqueName: \"kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh\") pod \"network-check-target-8wxmz\" (UID: \"afad1e88-aa25-44f7-8893-4eac3477f6c8\") " pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:17:55.473755 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:55.473708 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:55.473755 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:55.473729 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:55.473755 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:55.473740 2579 projected.go:194] Error preparing data for projected volume kube-api-access-4gqlh for pod openshift-network-diagnostics/network-check-target-8wxmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:55.473916 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:55.473800 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh podName:afad1e88-aa25-44f7-8893-4eac3477f6c8 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:11.473779398 +0000 UTC m=+34.381359404 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-4gqlh" (UniqueName: "kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh") pod "network-check-target-8wxmz" (UID: "afad1e88-aa25-44f7-8893-4eac3477f6c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:55.664808 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:55.664709 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:17:55.664808 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:55.664741 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:17:55.665342 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:55.664855 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31"
Apr 16 18:17:55.665342 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:55.664943 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8"
Apr 16 18:17:56.664996 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:56.664962 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:17:56.665378 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:56.665087 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa"
Apr 16 18:17:57.665530 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.665346 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:17:57.665858 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.665429 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:17:57.665858 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:57.665715 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31"
Apr 16 18:17:57.665858 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:57.665592 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8"
Apr 16 18:17:57.791298 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.791264 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z5qzj" event={"ID":"d5e8b6a3-f88f-40cd-be49-7c8a4efe8164","Type":"ContainerStarted","Data":"f1cb25223a098b242b1d2ecbea5ac811ac7117ad1cd7d0f674c4454a2c723b4b"}
Apr 16 18:17:57.793258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.793228 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" event={"ID":"e80b30e3-7c06-4a7b-a690-65a3e682b4d1","Type":"ContainerStarted","Data":"b6e352197f6045f4037a1683126e46315cf4890fd5605b7883251e0b41a8a486"}
Apr 16 18:17:57.794659 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.794635 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx5hq" event={"ID":"9ff1abd4-4c73-4895-854f-6aa240273e76","Type":"ContainerStarted","Data":"e30d93e1218e4153dcf7e05c5eb0bc98668eeded7d20cbc2612a57118099d6a2"}
Apr 16 18:17:57.796145 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.796116 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dkclb" event={"ID":"b93efba3-fc4f-41b6-82a3-6c4f89f9a12d","Type":"ContainerStarted","Data":"046e0c179c17f3f701bd8662a01f5c1cf56334738a0f975a4f83efe5b46c2305"}
Apr 16 18:17:57.797629 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.797587 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sfjlw" event={"ID":"0c49a704-d49e-48e8-a3bf-2cbbf59da5a5","Type":"ContainerStarted","Data":"a6376efc60ca32a641106960e313ca1569d77aa4af26539672e1831818400f39"}
Apr 16 18:17:57.799049 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.799028 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5gncp" event={"ID":"5eb4032a-87eb-4b5c-955d-2f54c02cd4de","Type":"ContainerStarted","Data":"d8cd6fdf97c48cc2d91f326401ce6175c419e23b8344666c9a973f1cc24b3943"}
Apr 16 18:17:57.802882 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.802861 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" event={"ID":"e7b29382-768b-4aa5-a896-07f32fc4d4e6","Type":"ContainerStarted","Data":"090013a34d5cc1cd327b0d623be3d4968bcaaa449036c45dfee2ec3170509162"}
Apr 16 18:17:57.804326 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.804285 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g2bjm" event={"ID":"a3216a58-ad89-4814-b5b4-7ae5bf98510e","Type":"ContainerStarted","Data":"d58b6acdf601eee0b04f20992c5ced020f5ef79a88bd0dd2fc203e092aaad30e"}
Apr 16 18:17:57.854219 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.854153 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-g2bjm" podStartSLOduration=3.755624418 podStartE2EDuration="20.854132094s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="2026-04-16 18:17:40.275913272 +0000 UTC m=+3.183493279" lastFinishedPulling="2026-04-16 18:17:57.374420941 +0000 UTC m=+20.282000955" observedRunningTime="2026-04-16 18:17:57.85400994 +0000 UTC m=+20.761589985" watchObservedRunningTime="2026-04-16 18:17:57.854132094 +0000 UTC m=+20.761712139"
Apr 16 18:17:57.854675 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.854634 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-z5qzj" podStartSLOduration=3.976108113 podStartE2EDuration="20.854623511s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="2026-04-16 18:17:40.274984377 +0000 UTC m=+3.182564387" lastFinishedPulling="2026-04-16 18:17:57.153499778 +0000 UTC m=+20.061079785" observedRunningTime="2026-04-16 18:17:57.828577166 +0000 UTC m=+20.736157217" watchObservedRunningTime="2026-04-16 18:17:57.854623511 +0000 UTC m=+20.762203541"
Apr 16 18:17:57.951586 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.951365 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dkclb" podStartSLOduration=4.076647708 podStartE2EDuration="20.951348434s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="2026-04-16 18:17:40.27882268 +0000 UTC m=+3.186402690" lastFinishedPulling="2026-04-16 18:17:57.153523396 +0000 UTC m=+20.061103416" observedRunningTime="2026-04-16 18:17:57.928152962 +0000 UTC m=+20.835732992" watchObservedRunningTime="2026-04-16 18:17:57.951348434 +0000 UTC m=+20.858928462"
Apr 16 18:17:57.951747 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.951687 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sfjlw" podStartSLOduration=3.855936739 podStartE2EDuration="20.951675304s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="2026-04-16 18:17:40.266354101 +0000 UTC m=+3.173934125" lastFinishedPulling="2026-04-16 18:17:57.362092669 +0000 UTC m=+20.269672690" observedRunningTime="2026-04-16 18:17:57.951331637 +0000 UTC m=+20.858911666" watchObservedRunningTime="2026-04-16 18:17:57.951675304 +0000 UTC m=+20.859255333"
Apr 16 18:17:57.987046 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:57.986992 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5gncp" podStartSLOduration=3.869650646 podStartE2EDuration="20.986972521s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="2026-04-16 18:17:40.279134804 +0000 UTC m=+3.186714811" lastFinishedPulling="2026-04-16 18:17:57.396456678 +0000 UTC m=+20.304036686" observedRunningTime="2026-04-16 18:17:57.984991786 +0000 UTC m=+20.892571845" watchObservedRunningTime="2026-04-16 18:17:57.986972521 +0000 UTC m=+20.894552548"
Apr 16 18:17:58.467790 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.467765 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:17:58.507606 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.507571 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-z5qzj"
Apr 16 18:17:58.508209 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.508192 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-z5qzj"
Apr 16 18:17:58.598273 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.598144 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:17:58.467784689Z","UUID":"cf079804-4244-4195-b34e-0028ba9f8394","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:17:58.599751 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.599733 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:17:58.599828 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.599756 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:17:58.664761 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.664725 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:17:58.664907 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:58.664830 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa"
Apr 16 18:17:58.808351 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.808319 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lr9r2" event={"ID":"f5e2592f-866e-429c-bc0a-2d1321bfc52d","Type":"ContainerStarted","Data":"a755cda8b7287a6db8ea3a48dd918f086fa189f2b7111bba1744ecc7fe5b9631"}
Apr 16 18:17:58.811477 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.811383 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" event={"ID":"e7b29382-768b-4aa5-a896-07f32fc4d4e6","Type":"ContainerStarted","Data":"96c1c5835f14d9c0c60345a8cd839516d9bbe8a1661b6bb687542d751a9956d1"}
Apr 16 18:17:58.811477 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.811438 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" event={"ID":"e7b29382-768b-4aa5-a896-07f32fc4d4e6","Type":"ContainerStarted","Data":"de30d7c3e4ed1669c6723da4ba261c53ce746e50c4d33c4a96baec2559e702e6"}
Apr 16 18:17:58.811477 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.811453 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" event={"ID":"e7b29382-768b-4aa5-a896-07f32fc4d4e6","Type":"ContainerStarted","Data":"b86a1c0424659bcc01ddb8907538c721a025b44a073c5f3b71bd270913b798d4"}
Apr 16 18:17:58.811477 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.811466 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" event={"ID":"e7b29382-768b-4aa5-a896-07f32fc4d4e6","Type":"ContainerStarted","Data":"96ca759f9b3336bf64b4039a822ecf45f49fad72c2ef9d17bce5f1d31403cd13"}
Apr 16 18:17:58.811477 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.811480 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" event={"ID":"e7b29382-768b-4aa5-a896-07f32fc4d4e6","Type":"ContainerStarted","Data":"3d5ec7568052aea2363699f630f4fb29be47f7314182d88453e9157acf0e42af"}
Apr 16 18:17:58.813091 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.813064 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" event={"ID":"e80b30e3-7c06-4a7b-a690-65a3e682b4d1","Type":"ContainerStarted","Data":"f41778c9b6a290b36de0bce847f966916df03f2023577261ab5e88f095363c7c"}
Apr 16 18:17:58.814530 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.814503 2579 generic.go:358] "Generic (PLEG): container finished" podID="9ff1abd4-4c73-4895-854f-6aa240273e76" containerID="e30d93e1218e4153dcf7e05c5eb0bc98668eeded7d20cbc2612a57118099d6a2" exitCode=0
Apr 16 18:17:58.814697 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.814668 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx5hq" event={"ID":"9ff1abd4-4c73-4895-854f-6aa240273e76","Type":"ContainerDied","Data":"e30d93e1218e4153dcf7e05c5eb0bc98668eeded7d20cbc2612a57118099d6a2"}
Apr 16 18:17:58.815908 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.815502 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-z5qzj"
Apr 16 18:17:58.816168 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.816150 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-z5qzj"
Apr 16 18:17:58.827629 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:58.827569 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lr9r2" podStartSLOduration=4.720882639 podStartE2EDuration="21.827550143s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="2026-04-16 18:17:40.269087576 +0000 UTC m=+3.176667586" lastFinishedPulling="2026-04-16 18:17:57.375755069 +0000 UTC m=+20.283335090" observedRunningTime="2026-04-16 18:17:58.827010024 +0000 UTC m=+21.734590054" watchObservedRunningTime="2026-04-16 18:17:58.827550143 +0000 UTC m=+21.735130174"
Apr 16 18:17:59.664518 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:59.664483 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:17:59.664752 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:59.664483 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:17:59.664752 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:59.664623 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31"
Apr 16 18:17:59.664752 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:17:59.664692 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8"
Apr 16 18:17:59.818218 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:17:59.818182 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" event={"ID":"e80b30e3-7c06-4a7b-a690-65a3e682b4d1","Type":"ContainerStarted","Data":"3eed9868137b2f90a7ade7b2de03b632a74ae5c338563724d4df68fc4c3a2194"}
Apr 16 18:18:00.665013 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:00.664979 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:18:00.665204 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:00.665109 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa"
Apr 16 18:18:00.823574 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:00.823532 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" event={"ID":"e7b29382-768b-4aa5-a896-07f32fc4d4e6","Type":"ContainerStarted","Data":"9dd08c26938e914bdf5db86b630600fb195f98bb9f67ba8acdd6061da40f128e"}
Apr 16 18:18:01.664596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:01.664390 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:18:01.664778 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:01.664386 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:18:01.664778 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:01.664716 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31"
Apr 16 18:18:01.664778 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:01.664734 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8"
Apr 16 18:18:02.664618 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:02.664587 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:18:02.665100 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:02.664693 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa"
Apr 16 18:18:02.828572 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:02.828538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:18:02.828758 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:02.828720 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:18:02.828869 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:02.828800 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret podName:d0db8fbf-eebe-4eb1-84d2-ef97f04477fa nodeName:}" failed. No retries permitted until 2026-04-16 18:18:18.8287805 +0000 UTC m=+41.736360520 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret") pod "global-pull-secret-syncer-nl9sg" (UID: "d0db8fbf-eebe-4eb1-84d2-ef97f04477fa") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:18:02.831280 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:02.831249 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" event={"ID":"e7b29382-768b-4aa5-a896-07f32fc4d4e6","Type":"ContainerStarted","Data":"d63066a8e3b564c4dd037307f3c93cd367593abd7e307a7dc4088b59f46936e9"}
Apr 16 18:18:02.831935 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:02.831720 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:18:02.831935 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:02.831750 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:18:02.847363 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:02.847338 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:18:02.847501 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:02.847484 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:18:02.866505 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:02.866456 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" podStartSLOduration=8.430044962 podStartE2EDuration="25.866436206s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="2026-04-16 18:17:40.276711541 +0000 UTC m=+3.184291553" lastFinishedPulling="2026-04-16 18:17:57.713102788 +0000 UTC m=+20.620682797" observedRunningTime="2026-04-16 18:18:02.866206455 +0000 UTC m=+25.773786487" watchObservedRunningTime="2026-04-16 18:18:02.866436206 +0000 UTC m=+25.774016233"
Apr 16 18:18:02.866954 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:02.866925 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m67rf" podStartSLOduration=6.966981533 podStartE2EDuration="25.866916759s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="2026-04-16 18:17:40.27465236 +0000 UTC m=+3.182232369" lastFinishedPulling="2026-04-16 18:17:59.174587585 +0000 UTC m=+22.082167595" observedRunningTime="2026-04-16 18:17:59.846150771 +0000 UTC m=+22.753730803" watchObservedRunningTime="2026-04-16 18:18:02.866916759 +0000 UTC m=+25.774496898"
Apr 16 18:18:03.664698 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:03.664503 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:18:03.665311 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:03.664504 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:18:03.665311 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:03.664772 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8"
Apr 16 18:18:03.665311 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:03.664871 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31"
Apr 16 18:18:03.834327 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:03.834291 2579 generic.go:358] "Generic (PLEG): container finished" podID="9ff1abd4-4c73-4895-854f-6aa240273e76" containerID="691b88ce78dcf4085c026113621f350f23679cab20f19d792e646db170f32b23" exitCode=0
Apr 16 18:18:03.834533 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:03.834379 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx5hq" event={"ID":"9ff1abd4-4c73-4895-854f-6aa240273e76","Type":"ContainerDied","Data":"691b88ce78dcf4085c026113621f350f23679cab20f19d792e646db170f32b23"}
Apr 16 18:18:03.834620 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:03.834607 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:18:04.430701 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:04.430672 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf"
Apr 16 18:18:04.658252 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:04.658214 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8wxmz"]
Apr 16 18:18:04.658444 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:04.658314 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:18:04.658444 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:04.658425 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8"
Apr 16 18:18:04.660608 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:04.660583 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h2jgj"]
Apr 16 18:18:04.660729 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:04.660669 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:18:04.660792 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:04.660774 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31"
Apr 16 18:18:04.664722 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:04.664702 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:18:04.665019 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:04.664788 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa"
Apr 16 18:18:04.673201 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:04.673177 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nl9sg"]
Apr 16 18:18:04.837850 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:04.837814 2579 generic.go:358] "Generic (PLEG): container finished" podID="9ff1abd4-4c73-4895-854f-6aa240273e76" containerID="024fb5b4c3bef8a1f5bb8251443c0dd086ea0d6dcf438d01a685b8ca75dcfbe2" exitCode=0
Apr 16 18:18:04.838035 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:04.837905 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx5hq" event={"ID":"9ff1abd4-4c73-4895-854f-6aa240273e76","Type":"ContainerDied","Data":"024fb5b4c3bef8a1f5bb8251443c0dd086ea0d6dcf438d01a685b8ca75dcfbe2"}
Apr 16 18:18:04.838035 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:04.838001 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:18:04.838210 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:04.838183 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa"
Apr 16 18:18:05.842387 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:05.842354 2579 generic.go:358] "Generic (PLEG): container finished" podID="9ff1abd4-4c73-4895-854f-6aa240273e76" containerID="5f1fd4db35a60592510f447411aef04cbd710589b7c27b810589ce2e3528d67c" exitCode=0
Apr 16 18:18:05.842772 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:05.842441 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx5hq" event={"ID":"9ff1abd4-4c73-4895-854f-6aa240273e76","Type":"ContainerDied","Data":"5f1fd4db35a60592510f447411aef04cbd710589b7c27b810589ce2e3528d67c"}
Apr 16 18:18:06.664825 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:06.664742 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:18:06.664973 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:06.664748 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:18:06.664973 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:06.664859 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa"
Apr 16 18:18:06.664973 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:06.664956 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31"
Apr 16 18:18:06.665122 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:06.664756 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:18:06.665122 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:06.665053 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8"
Apr 16 18:18:08.665016 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:08.664927 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:18:08.665464 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:08.664934 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:18:08.665464 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:08.665060 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nl9sg" podUID="d0db8fbf-eebe-4eb1-84d2-ef97f04477fa"
Apr 16 18:18:08.665464 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:08.664942 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:18:08.665464 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:08.665142 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8wxmz" podUID="afad1e88-aa25-44f7-8893-4eac3477f6c8"
Apr 16 18:18:08.665464 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:08.665230 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31"
Apr 16 18:18:10.445494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.445420 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-88.ec2.internal" event="NodeReady"
Apr 16 18:18:10.446015 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.445558 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 18:18:10.492497 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.492461 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mh6jk"]
Apr 16 18:18:10.519417 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.519311 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-t9hjb"]
Apr 16 18:18:10.519666 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.519490 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:10.521891 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.521865 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:18:10.522152 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.522107 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8xm9h\""
Apr 16 18:18:10.522545 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.522528 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:18:10.535294 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.535276 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t9hjb"]
Apr 16 18:18:10.535294 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.535296 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mh6jk"]
Apr 16 18:18:10.535448 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.535373 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t9hjb"
Apr 16 18:18:10.539943 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.539917 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:18:10.540054 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.539918 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:18:10.540054 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.540000 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:18:10.540054 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.539945 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rxx2m\""
Apr 16 18:18:10.586277 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.586250 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:10.586481 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.586288 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-tmp-dir\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:10.586481 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.586336 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-config-volume\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:10.586606 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.586464 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtc4v\" (UniqueName: \"kubernetes.io/projected/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-kube-api-access-vtc4v\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:10.664297 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.664249 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg"
Apr 16 18:18:10.664503 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.664251 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:18:10.664604 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.664249 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:18:10.667246 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.667219 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:18:10.667590 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.667491 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:18:10.667590 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.667513 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:18:10.667590 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.667558 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k7xtv\""
Apr 16 18:18:10.667590 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.667586 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:18:10.667907 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.667892 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kzxcx\""
Apr 16 18:18:10.687496 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.687475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:10.687615 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.687503 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-tmp-dir\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:10.687615 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.687531 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb"
Apr 16 18:18:10.687711 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:10.687623 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:18:10.687711 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.687673 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-config-volume\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:10.687711 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:10.687693 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls podName:15e80e92-2f17-4ce6-a0cc-2073e197b9c2 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:11.187667725 +0000 UTC m=+34.095247734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls") pod "dns-default-mh6jk" (UID: "15e80e92-2f17-4ce6-a0cc-2073e197b9c2") : secret "dns-default-metrics-tls" not found
Apr 16 18:18:10.687851 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.687729 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dnqw\" (UniqueName: \"kubernetes.io/projected/d06241a9-55bd-4260-9855-06114156f4d2-kube-api-access-5dnqw\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb"
Apr 16 18:18:10.687851 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.687797 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-tmp-dir\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:10.687851 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.687815 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtc4v\" (UniqueName: \"kubernetes.io/projected/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-kube-api-access-vtc4v\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:10.688152 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.688136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-config-volume\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:10.700159 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.700111 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtc4v\" (UniqueName: \"kubernetes.io/projected/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-kube-api-access-vtc4v\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:10.788296 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.788252 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dnqw\" (UniqueName: \"kubernetes.io/projected/d06241a9-55bd-4260-9855-06114156f4d2-kube-api-access-5dnqw\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb"
Apr 16 18:18:10.788487 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.788385 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb"
Apr 16 18:18:10.788563 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:10.788539 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:18:10.788660 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:10.788647 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert podName:d06241a9-55bd-4260-9855-06114156f4d2 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:11.288613315 +0000 UTC m=+34.196193324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert") pod "ingress-canary-t9hjb" (UID: "d06241a9-55bd-4260-9855-06114156f4d2") : secret "canary-serving-cert" not found
Apr 16 18:18:10.798558 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:10.798533 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dnqw\" (UniqueName: \"kubernetes.io/projected/d06241a9-55bd-4260-9855-06114156f4d2-kube-api-access-5dnqw\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb"
Apr 16 18:18:11.191093 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:11.191048 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:11.191384 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:11.191226 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:18:11.191384 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:11.191301 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls podName:15e80e92-2f17-4ce6-a0cc-2073e197b9c2 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:12.19128196 +0000 UTC m=+35.098861969 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls") pod "dns-default-mh6jk" (UID: "15e80e92-2f17-4ce6-a0cc-2073e197b9c2") : secret "dns-default-metrics-tls" not found
Apr 16 18:18:11.292338 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:11.292307 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:18:11.292494 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:11.292363 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb"
Apr 16 18:18:11.292494 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:11.292476 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:18:11.292563 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:11.292540 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs podName:a84444e3-6cab-4290-a61c-c01132150e31 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:43.292524789 +0000 UTC m=+66.200104800 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs") pod "network-metrics-daemon-h2jgj" (UID: "a84444e3-6cab-4290-a61c-c01132150e31") : secret "metrics-daemon-secret" not found
Apr 16 18:18:11.292610 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:11.292477 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:18:11.292653 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:11.292631 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert podName:d06241a9-55bd-4260-9855-06114156f4d2 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:12.292619372 +0000 UTC m=+35.200199383 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert") pod "ingress-canary-t9hjb" (UID: "d06241a9-55bd-4260-9855-06114156f4d2") : secret "canary-serving-cert" not found
Apr 16 18:18:11.494478 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:11.494445 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gqlh\" (UniqueName: \"kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh\") pod \"network-check-target-8wxmz\" (UID: \"afad1e88-aa25-44f7-8893-4eac3477f6c8\") " pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:18:11.496971 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:11.496944 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gqlh\" (UniqueName: \"kubernetes.io/projected/afad1e88-aa25-44f7-8893-4eac3477f6c8-kube-api-access-4gqlh\") pod \"network-check-target-8wxmz\" (UID: \"afad1e88-aa25-44f7-8893-4eac3477f6c8\") " pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:18:11.587238 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:11.587201 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8wxmz"
Apr 16 18:18:11.819864 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:11.819673 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8wxmz"]
Apr 16 18:18:11.834342 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:18:11.834305 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafad1e88_aa25_44f7_8893_4eac3477f6c8.slice/crio-f5f141f3764ff6a5e69a90567002d55aa40cabb817a1f03b7835b287e3b82d3e WatchSource:0}: Error finding container f5f141f3764ff6a5e69a90567002d55aa40cabb817a1f03b7835b287e3b82d3e: Status 404 returned error can't find the container with id f5f141f3764ff6a5e69a90567002d55aa40cabb817a1f03b7835b287e3b82d3e
Apr 16 18:18:11.854848 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:11.854823 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8wxmz" event={"ID":"afad1e88-aa25-44f7-8893-4eac3477f6c8","Type":"ContainerStarted","Data":"f5f141f3764ff6a5e69a90567002d55aa40cabb817a1f03b7835b287e3b82d3e"}
Apr 16 18:18:12.199181 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:12.199147 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk"
Apr 16 18:18:12.199358 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:12.199268 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:18:12.199358 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:12.199339 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls podName:15e80e92-2f17-4ce6-a0cc-2073e197b9c2 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:14.199323428 +0000 UTC m=+37.106903446 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls") pod "dns-default-mh6jk" (UID: "15e80e92-2f17-4ce6-a0cc-2073e197b9c2") : secret "dns-default-metrics-tls" not found Apr 16 18:18:12.299617 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:12.299535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb" Apr 16 18:18:12.299762 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:12.299680 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:12.299762 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:12.299742 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert podName:d06241a9-55bd-4260-9855-06114156f4d2 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:14.299725998 +0000 UTC m=+37.207306004 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert") pod "ingress-canary-t9hjb" (UID: "d06241a9-55bd-4260-9855-06114156f4d2") : secret "canary-serving-cert" not found Apr 16 18:18:12.858315 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:12.858282 2579 generic.go:358] "Generic (PLEG): container finished" podID="9ff1abd4-4c73-4895-854f-6aa240273e76" containerID="53af7c3774359f2d0b2e61bbef80f105b3e81208cb801638fd43c82564b95ffe" exitCode=0 Apr 16 18:18:12.858801 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:12.858329 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx5hq" event={"ID":"9ff1abd4-4c73-4895-854f-6aa240273e76","Type":"ContainerDied","Data":"53af7c3774359f2d0b2e61bbef80f105b3e81208cb801638fd43c82564b95ffe"} Apr 16 18:18:13.864199 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:13.864160 2579 generic.go:358] "Generic (PLEG): container finished" podID="9ff1abd4-4c73-4895-854f-6aa240273e76" containerID="c991085e6127f1383784092ad5760c75d96413ee9b456fdcd97d93958881a952" exitCode=0 Apr 16 18:18:13.864653 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:13.864204 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx5hq" event={"ID":"9ff1abd4-4c73-4895-854f-6aa240273e76","Type":"ContainerDied","Data":"c991085e6127f1383784092ad5760c75d96413ee9b456fdcd97d93958881a952"} Apr 16 18:18:14.215507 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:14.215466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk" Apr 16 18:18:14.215686 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:14.215644 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:14.215749 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:14.215727 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls podName:15e80e92-2f17-4ce6-a0cc-2073e197b9c2 
nodeName:}" failed. No retries permitted until 2026-04-16 18:18:18.215704873 +0000 UTC m=+41.123284885 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls") pod "dns-default-mh6jk" (UID: "15e80e92-2f17-4ce6-a0cc-2073e197b9c2") : secret "dns-default-metrics-tls" not found Apr 16 18:18:14.316136 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:14.316094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb" Apr 16 18:18:14.316319 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:14.316264 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:14.316371 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:14.316349 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert podName:d06241a9-55bd-4260-9855-06114156f4d2 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:18.316331217 +0000 UTC m=+41.223911223 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert") pod "ingress-canary-t9hjb" (UID: "d06241a9-55bd-4260-9855-06114156f4d2") : secret "canary-serving-cert" not found Apr 16 18:18:15.871352 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:15.871316 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kx5hq" event={"ID":"9ff1abd4-4c73-4895-854f-6aa240273e76","Type":"ContainerStarted","Data":"4d8eaff2c8471be28851f4909b3fcfbc8a0f0c07c69735cc2893558217c1fb51"} Apr 16 18:18:15.872719 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:15.872696 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8wxmz" event={"ID":"afad1e88-aa25-44f7-8893-4eac3477f6c8","Type":"ContainerStarted","Data":"06f01933304e641f3159a8ad0d3a074f24c0b879b43bdb6f46105979285707ba"} Apr 16 18:18:15.872837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:15.872822 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-8wxmz" Apr 16 18:18:15.896362 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:15.896312 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kx5hq" podStartSLOduration=7.494978251 podStartE2EDuration="38.896298214s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="2026-04-16 18:17:40.269328226 +0000 UTC m=+3.176908233" lastFinishedPulling="2026-04-16 18:18:11.670648173 +0000 UTC m=+34.578228196" observedRunningTime="2026-04-16 18:18:15.894930674 +0000 UTC m=+38.802510758" watchObservedRunningTime="2026-04-16 18:18:15.896298214 +0000 UTC m=+38.803878222" Apr 16 18:18:15.910964 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:15.910914 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-8wxmz" podStartSLOduration=35.685313028 podStartE2EDuration="38.91089984s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="2026-04-16 
18:18:11.836710147 +0000 UTC m=+34.744290154" lastFinishedPulling="2026-04-16 18:18:15.06229696 +0000 UTC m=+37.969876966" observedRunningTime="2026-04-16 18:18:15.910424376 +0000 UTC m=+38.818004399" watchObservedRunningTime="2026-04-16 18:18:15.91089984 +0000 UTC m=+38.818479871" Apr 16 18:18:18.243760 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:18.243713 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk" Apr 16 18:18:18.244118 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:18.243851 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:18.244118 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:18.243911 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls podName:15e80e92-2f17-4ce6-a0cc-2073e197b9c2 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:26.243893568 +0000 UTC m=+49.151473594 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls") pod "dns-default-mh6jk" (UID: "15e80e92-2f17-4ce6-a0cc-2073e197b9c2") : secret "dns-default-metrics-tls" not found Apr 16 18:18:18.344449 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:18.344409 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb" Apr 16 18:18:18.344583 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:18.344547 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:18.344624 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:18.344603 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert podName:d06241a9-55bd-4260-9855-06114156f4d2 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:26.344587803 +0000 UTC m=+49.252167810 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert") pod "ingress-canary-t9hjb" (UID: "d06241a9-55bd-4260-9855-06114156f4d2") : secret "canary-serving-cert" not found Apr 16 18:18:18.848542 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:18.848509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:18:18.866809 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:18.866777 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d0db8fbf-eebe-4eb1-84d2-ef97f04477fa-original-pull-secret\") pod \"global-pull-secret-syncer-nl9sg\" (UID: \"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa\") " pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:18:19.083113 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:19.083073 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nl9sg" Apr 16 18:18:19.199271 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:19.199238 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nl9sg"] Apr 16 18:18:19.202494 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:18:19.202455 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0db8fbf_eebe_4eb1_84d2_ef97f04477fa.slice/crio-4e6c9217446bf286ab276e78f20ed4c8315fe0f71db30d51cdf0bf15e67ae7ab WatchSource:0}: Error finding container 4e6c9217446bf286ab276e78f20ed4c8315fe0f71db30d51cdf0bf15e67ae7ab: Status 404 returned error can't find the container with id 4e6c9217446bf286ab276e78f20ed4c8315fe0f71db30d51cdf0bf15e67ae7ab Apr 16 18:18:19.881519 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:19.881482 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nl9sg" event={"ID":"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa","Type":"ContainerStarted","Data":"4e6c9217446bf286ab276e78f20ed4c8315fe0f71db30d51cdf0bf15e67ae7ab"} Apr 16 18:18:23.890237 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:23.890140 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nl9sg" event={"ID":"d0db8fbf-eebe-4eb1-84d2-ef97f04477fa","Type":"ContainerStarted","Data":"6885aa02edad7a06b3c8a3655e72a6c257423ce2949e6e34bac498df88e1ddbd"} Apr 16 18:18:23.907628 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:23.907574 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-nl9sg" podStartSLOduration=32.490254261 podStartE2EDuration="36.907559229s" podCreationTimestamp="2026-04-16 18:17:47 +0000 UTC" firstStartedPulling="2026-04-16 18:18:19.20425462 +0000 UTC m=+42.111834642" lastFinishedPulling="2026-04-16 18:18:23.621559599 +0000 UTC m=+46.529139610" observedRunningTime="2026-04-16 18:18:23.906918866 +0000 UTC m=+46.814498907" watchObservedRunningTime="2026-04-16 18:18:23.907559229 +0000 UTC m=+46.815139271" Apr 16 18:18:26.303748 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:26.303704 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk" Apr 16 18:18:26.304133 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:26.303820 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:26.304133 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:26.303880 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls podName:15e80e92-2f17-4ce6-a0cc-2073e197b9c2 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:42.303863391 +0000 UTC m=+65.211443419 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls") pod "dns-default-mh6jk" (UID: "15e80e92-2f17-4ce6-a0cc-2073e197b9c2") : secret "dns-default-metrics-tls" not found Apr 16 18:18:26.404410 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:26.404364 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb" Apr 16 18:18:26.404587 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:26.404562 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:26.404661 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:26.404649 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert podName:d06241a9-55bd-4260-9855-06114156f4d2 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:42.404626219 +0000 UTC m=+65.312206232 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert") pod "ingress-canary-t9hjb" (UID: "d06241a9-55bd-4260-9855-06114156f4d2") : secret "canary-serving-cert" not found Apr 16 18:18:35.861208 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:35.861170 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zkthf" Apr 16 18:18:42.318182 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:42.318136 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk" Apr 16 18:18:42.318600 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:42.318270 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:42.318600 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:42.318343 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls podName:15e80e92-2f17-4ce6-a0cc-2073e197b9c2 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:14.318327889 +0000 UTC m=+97.225907895 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls") pod "dns-default-mh6jk" (UID: "15e80e92-2f17-4ce6-a0cc-2073e197b9c2") : secret "dns-default-metrics-tls" not found Apr 16 18:18:42.419176 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:42.419142 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb" Apr 16 18:18:42.419296 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:42.419277 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:42.419351 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:42.419341 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert podName:d06241a9-55bd-4260-9855-06114156f4d2 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:14.419326148 +0000 UTC m=+97.326906155 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert") pod "ingress-canary-t9hjb" (UID: "d06241a9-55bd-4260-9855-06114156f4d2") : secret "canary-serving-cert" not found Apr 16 18:18:43.324534 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:43.324492 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:18:43.325025 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:43.324648 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:18:43.325025 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:18:43.324755 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs podName:a84444e3-6cab-4290-a61c-c01132150e31 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:47.324725254 +0000 UTC m=+130.232305263 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs") pod "network-metrics-daemon-h2jgj" (UID: "a84444e3-6cab-4290-a61c-c01132150e31") : secret "metrics-daemon-secret" not found Apr 16 18:18:46.876991 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:18:46.876954 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-8wxmz" Apr 16 18:19:14.338252 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:14.338215 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk" Apr 16 18:19:14.338594 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:14.338366 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:19:14.338594 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:14.338457 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls podName:15e80e92-2f17-4ce6-a0cc-2073e197b9c2 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:18.338437997 +0000 UTC m=+161.246018005 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls") pod "dns-default-mh6jk" (UID: "15e80e92-2f17-4ce6-a0cc-2073e197b9c2") : secret "dns-default-metrics-tls" not found Apr 16 18:19:14.438897 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:14.438859 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb" Apr 16 18:19:14.439053 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:14.438967 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:19:14.439053 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:14.439024 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert podName:d06241a9-55bd-4260-9855-06114156f4d2 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:18.439010888 +0000 UTC m=+161.346590896 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert") pod "ingress-canary-t9hjb" (UID: "d06241a9-55bd-4260-9855-06114156f4d2") : secret "canary-serving-cert" not found Apr 16 18:19:47.362429 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:47.362359 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:19:47.362910 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:47.362527 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:19:47.362910 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:47.362607 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs podName:a84444e3-6cab-4290-a61c-c01132150e31 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:49.362589999 +0000 UTC m=+252.270170005 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs") pod "network-metrics-daemon-h2jgj" (UID: "a84444e3-6cab-4290-a61c-c01132150e31") : secret "metrics-daemon-secret" not found Apr 16 18:19:56.948620 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.948588 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x"] Apr 16 18:19:56.951226 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.951205 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-hjml7"] Apr 16 18:19:56.951352 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.951319 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:19:56.953725 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.953706 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-rlbq9"] Apr 16 18:19:56.953843 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.953828 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:56.956468 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.956447 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-6tvbf"] Apr 16 18:19:56.956616 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.956600 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rlbq9" Apr 16 18:19:56.956745 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.956704 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 18:19:56.956823 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.956762 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 18:19:56.957030 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.957014 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 18:19:56.957148 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.957045 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 18:19:56.957575 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.957554 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 18:19:56.957667 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.957555 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:19:56.957667 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.957656 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-2lcjx\"" Apr 16 18:19:56.957829 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.957816 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-wc7mm\"" Apr 16 18:19:56.958562 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.958543 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:19:56.959490 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.959475 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:56.959898 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.959881 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-5665g\"" Apr 16 18:19:56.962484 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.962468 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:19:56.962586 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.962505 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:19:56.962586 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.962529 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 18:19:56.962586 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.962582 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 18:19:56.963000 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.962984 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-xpcvj\"" Apr 16 18:19:56.970984 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.970941 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-hjml7"] Apr 16 18:19:56.972292 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.972271 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x"] Apr 16 18:19:56.974363 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.974342 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 18:19:56.974470 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.974423 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 18:19:56.980201 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.980181 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-rlbq9"] Apr 16 18:19:56.984535 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:56.984498 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-6tvbf"] Apr 16 18:19:57.025102 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025057 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxmg4\" (UniqueName: \"kubernetes.io/projected/ffe01856-0cc9-447f-901b-d95c5767d43b-kube-api-access-sxmg4\") pod \"cluster-samples-operator-667775844f-99q4x\" (UID: \"ffe01856-0cc9-447f-901b-d95c5767d43b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:19:57.025102 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025097 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw4np\" (UniqueName: \"kubernetes.io/projected/f3230a09-30be-4152-ac36-65d7911245a2-kube-api-access-bw4np\") pod \"console-operator-d87b8d5fc-hjml7\" (UID: \"f3230a09-30be-4152-ac36-65d7911245a2\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.025311 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025143 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8f3ed9b-bbc2-445b-9380-064dc07640d5-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.025311 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8f3ed9b-bbc2-445b-9380-064dc07640d5-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.025311 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-99q4x\" (UID: \"ffe01856-0cc9-447f-901b-d95c5767d43b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:19:57.025311 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025199 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f8f3ed9b-bbc2-445b-9380-064dc07640d5-snapshots\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.025311 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025257 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t42ld\" (UniqueName: \"kubernetes.io/projected/f8f3ed9b-bbc2-445b-9380-064dc07640d5-kube-api-access-t42ld\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.025311 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025275 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrxp6\" (UniqueName: \"kubernetes.io/projected/96b64fed-ad15-4e9c-b012-f7c3cac2fcb4-kube-api-access-rrxp6\") pod \"network-check-source-7b678d77c7-rlbq9\" (UID: \"96b64fed-ad15-4e9c-b012-f7c3cac2fcb4\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rlbq9" Apr 16 18:19:57.025311 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025300 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3230a09-30be-4152-ac36-65d7911245a2-config\") pod \"console-operator-d87b8d5fc-hjml7\" (UID: \"f3230a09-30be-4152-ac36-65d7911245a2\") " pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.025629 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025316 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f3230a09-30be-4152-ac36-65d7911245a2-serving-cert\") pod \"console-operator-d87b8d5fc-hjml7\" (UID: \"f3230a09-30be-4152-ac36-65d7911245a2\") " pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.025629 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025334 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3230a09-30be-4152-ac36-65d7911245a2-trusted-ca\") pod \"console-operator-d87b8d5fc-hjml7\" (UID: \"f3230a09-30be-4152-ac36-65d7911245a2\") " pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.025629 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025384 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8f3ed9b-bbc2-445b-9380-064dc07640d5-serving-cert\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.025629 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.025454 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8f3ed9b-bbc2-445b-9380-064dc07640d5-tmp\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.041615 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.041587 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rdhmq"] Apr 16 18:19:57.044510 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.044497 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rdhmq" Apr 16 18:19:57.047716 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.047689 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-84d7cb9877-cnlt9"] Apr 16 18:19:57.047968 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.047945 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-5qj8v\"" Apr 16 18:19:57.048096 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.048077 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 18:19:57.048096 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.047949 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:19:57.050479 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.050461 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.052711 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.052689 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:19:57.052711 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.052706 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:19:57.052865 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.052713 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:19:57.054000 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.053979 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rdhmq"] Apr 16 18:19:57.054097 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.054008 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-sb5br\"" Apr 16 18:19:57.059200 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.059180 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:19:57.063064 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.063044 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84d7cb9877-cnlt9"] Apr 16 18:19:57.125970 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.125938 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-bound-sa-token\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.125970 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.125970 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4mdd\" (UniqueName: \"kubernetes.io/projected/13962cee-2aff-4652-a7e9-530d3125242a-kube-api-access-m4mdd\") pod \"volume-data-source-validator-7d955d5dd4-rdhmq\" (UID: \"13962cee-2aff-4652-a7e9-530d3125242a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rdhmq" Apr 16 18:19:57.126168 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126013 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw4np\" (UniqueName: \"kubernetes.io/projected/f3230a09-30be-4152-ac36-65d7911245a2-kube-api-access-bw4np\") pod \"console-operator-d87b8d5fc-hjml7\" (UID: \"f3230a09-30be-4152-ac36-65d7911245a2\") " pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.126168 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126038 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-certificates\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.126168 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126054 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/701b7355-0fb2-412e-9af7-09e007fb99bc-image-registry-private-configuration\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.126168 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-99q4x\" (UID: \"ffe01856-0cc9-447f-901b-d95c5767d43b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:19:57.126168 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126108 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrxp6\" (UniqueName: \"kubernetes.io/projected/96b64fed-ad15-4e9c-b012-f7c3cac2fcb4-kube-api-access-rrxp6\") pod \"network-check-source-7b678d77c7-rlbq9\" (UID: \"96b64fed-ad15-4e9c-b012-f7c3cac2fcb4\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rlbq9" Apr 16 18:19:57.126168 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126124 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3230a09-30be-4152-ac36-65d7911245a2-config\") pod \"console-operator-d87b8d5fc-hjml7\" (UID: \"f3230a09-30be-4152-ac36-65d7911245a2\") " pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.126168 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126140 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/701b7355-0fb2-412e-9af7-09e007fb99bc-trusted-ca\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.126168 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:57.126154 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:19:57.126168 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126168 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3230a09-30be-4152-ac36-65d7911245a2-trusted-ca\") pod \"console-operator-d87b8d5fc-hjml7\" (UID: \"f3230a09-30be-4152-ac36-65d7911245a2\") " pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.126518 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126193 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q85kb\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-kube-api-access-q85kb\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.126518 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:57.126218 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls podName:ffe01856-0cc9-447f-901b-d95c5767d43b 
nodeName:}" failed. No retries permitted until 2026-04-16 18:19:57.626199193 +0000 UTC m=+140.533779200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls") pod "cluster-samples-operator-667775844f-99q4x" (UID: "ffe01856-0cc9-447f-901b-d95c5767d43b") : secret "samples-operator-tls" not found Apr 16 18:19:57.126518 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126325 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8f3ed9b-bbc2-445b-9380-064dc07640d5-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.126518 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126356 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8f3ed9b-bbc2-445b-9380-064dc07640d5-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.126518 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxmg4\" (UniqueName: \"kubernetes.io/projected/ffe01856-0cc9-447f-901b-d95c5767d43b-kube-api-access-sxmg4\") pod \"cluster-samples-operator-667775844f-99q4x\" (UID: \"ffe01856-0cc9-447f-901b-d95c5767d43b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:19:57.126730 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f8f3ed9b-bbc2-445b-9380-064dc07640d5-snapshots\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.126730 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126587 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t42ld\" (UniqueName: \"kubernetes.io/projected/f8f3ed9b-bbc2-445b-9380-064dc07640d5-kube-api-access-t42ld\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.126730 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126621 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.126730 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3230a09-30be-4152-ac36-65d7911245a2-serving-cert\") pod \"console-operator-d87b8d5fc-hjml7\" (UID: \"f3230a09-30be-4152-ac36-65d7911245a2\") " pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.126730 ip-10-0-138-88 
kubenswrapper[2579]: I0416 18:19:57.126681 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/701b7355-0fb2-412e-9af7-09e007fb99bc-installation-pull-secrets\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.126730 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126712 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8f3ed9b-bbc2-445b-9380-064dc07640d5-serving-cert\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.127011 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126741 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8f3ed9b-bbc2-445b-9380-064dc07640d5-tmp\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.127011 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.126767 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/701b7355-0fb2-412e-9af7-09e007fb99bc-ca-trust-extracted\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.127119 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.127031 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3230a09-30be-4152-ac36-65d7911245a2-config\") pod \"console-operator-d87b8d5fc-hjml7\" (UID: \"f3230a09-30be-4152-ac36-65d7911245a2\") " pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.127119 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.127054 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8f3ed9b-bbc2-445b-9380-064dc07640d5-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.127219 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.127159 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f8f3ed9b-bbc2-445b-9380-064dc07640d5-snapshots\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.127283 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.127225 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8f3ed9b-bbc2-445b-9380-064dc07640d5-tmp\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.127283 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.127240 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f8f3ed9b-bbc2-445b-9380-064dc07640d5-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.127375 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.127331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3230a09-30be-4152-ac36-65d7911245a2-trusted-ca\") pod \"console-operator-d87b8d5fc-hjml7\" (UID: \"f3230a09-30be-4152-ac36-65d7911245a2\") " pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.129212 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.129191 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8f3ed9b-bbc2-445b-9380-064dc07640d5-serving-cert\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.129212 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.129209 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3230a09-30be-4152-ac36-65d7911245a2-serving-cert\") pod \"console-operator-d87b8d5fc-hjml7\" (UID: \"f3230a09-30be-4152-ac36-65d7911245a2\") " pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.136766 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.136732 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrxp6\" (UniqueName: \"kubernetes.io/projected/96b64fed-ad15-4e9c-b012-f7c3cac2fcb4-kube-api-access-rrxp6\") pod \"network-check-source-7b678d77c7-rlbq9\" (UID: \"96b64fed-ad15-4e9c-b012-f7c3cac2fcb4\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rlbq9" Apr 16 18:19:57.136899 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.136808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw4np\" (UniqueName: \"kubernetes.io/projected/f3230a09-30be-4152-ac36-65d7911245a2-kube-api-access-bw4np\") pod \"console-operator-d87b8d5fc-hjml7\" (UID: \"f3230a09-30be-4152-ac36-65d7911245a2\") " pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.137564 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.137450 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t42ld\" (UniqueName: \"kubernetes.io/projected/f8f3ed9b-bbc2-445b-9380-064dc07640d5-kube-api-access-t42ld\") pod \"insights-operator-5785d4fcdd-6tvbf\" (UID: \"f8f3ed9b-bbc2-445b-9380-064dc07640d5\") " pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.141379 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.141003 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxmg4\" (UniqueName: \"kubernetes.io/projected/ffe01856-0cc9-447f-901b-d95c5767d43b-kube-api-access-sxmg4\") pod \"cluster-samples-operator-667775844f-99q4x\" (UID: \"ffe01856-0cc9-447f-901b-d95c5767d43b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:19:57.227639 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.227539 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls\") pod 
\"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.227639 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.227592 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/701b7355-0fb2-412e-9af7-09e007fb99bc-installation-pull-secrets\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.227639 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.227620 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/701b7355-0fb2-412e-9af7-09e007fb99bc-ca-trust-extracted\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.227901 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.227666 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-bound-sa-token\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.227901 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.227689 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4mdd\" (UniqueName: \"kubernetes.io/projected/13962cee-2aff-4652-a7e9-530d3125242a-kube-api-access-m4mdd\") pod \"volume-data-source-validator-7d955d5dd4-rdhmq\" (UID: \"13962cee-2aff-4652-a7e9-530d3125242a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rdhmq" Apr 16 18:19:57.227901 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:57.227702 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:19:57.227901 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:57.227722 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84d7cb9877-cnlt9: secret "image-registry-tls" not found Apr 16 18:19:57.227901 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.227729 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-certificates\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.227901 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.227754 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/701b7355-0fb2-412e-9af7-09e007fb99bc-image-registry-private-configuration\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.227901 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:57.227795 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls 
podName:701b7355-0fb2-412e-9af7-09e007fb99bc nodeName:}" failed. No retries permitted until 2026-04-16 18:19:57.727777209 +0000 UTC m=+140.635357216 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls") pod "image-registry-84d7cb9877-cnlt9" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc") : secret "image-registry-tls" not found Apr 16 18:19:57.227901 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.227847 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/701b7355-0fb2-412e-9af7-09e007fb99bc-trusted-ca\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.227901 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.227881 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q85kb\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-kube-api-access-q85kb\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.228284 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.228074 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/701b7355-0fb2-412e-9af7-09e007fb99bc-ca-trust-extracted\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.228316 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.228306 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-certificates\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.228929 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.228904 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/701b7355-0fb2-412e-9af7-09e007fb99bc-trusted-ca\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.230127 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.230107 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/701b7355-0fb2-412e-9af7-09e007fb99bc-installation-pull-secrets\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.230349 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.230331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/701b7355-0fb2-412e-9af7-09e007fb99bc-image-registry-private-configuration\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.237196 ip-10-0-138-88 kubenswrapper[2579]: I0416 
18:19:57.237175 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-bound-sa-token\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.237504 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.237484 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4mdd\" (UniqueName: \"kubernetes.io/projected/13962cee-2aff-4652-a7e9-530d3125242a-kube-api-access-m4mdd\") pod \"volume-data-source-validator-7d955d5dd4-rdhmq\" (UID: \"13962cee-2aff-4652-a7e9-530d3125242a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rdhmq" Apr 16 18:19:57.237600 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.237585 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q85kb\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-kube-api-access-q85kb\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.269460 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.269441 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:19:57.278000 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.277908 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rlbq9" Apr 16 18:19:57.282552 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.282534 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" Apr 16 18:19:57.353625 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.353536 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rdhmq" Apr 16 18:19:57.422090 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.422056 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-hjml7"] Apr 16 18:19:57.426208 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:19:57.426183 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3230a09_30be_4152_ac36_65d7911245a2.slice/crio-d0bb9a363a83b34e91b688c3bab7194bae9210b5f0bfe077e71cf7124f00314f WatchSource:0}: Error finding container d0bb9a363a83b34e91b688c3bab7194bae9210b5f0bfe077e71cf7124f00314f: Status 404 returned error can't find the container with id d0bb9a363a83b34e91b688c3bab7194bae9210b5f0bfe077e71cf7124f00314f Apr 16 18:19:57.485151 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.485079 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rdhmq"] Apr 16 18:19:57.487271 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:19:57.487247 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13962cee_2aff_4652_a7e9_530d3125242a.slice/crio-61f0b57f164a83ed1160a052df1e47c08d7f1597150fbf6d758f5c848c016aeb WatchSource:0}: Error finding container 61f0b57f164a83ed1160a052df1e47c08d7f1597150fbf6d758f5c848c016aeb: Status 404 returned error can't find the container with id 61f0b57f164a83ed1160a052df1e47c08d7f1597150fbf6d758f5c848c016aeb Apr 16 18:19:57.631292 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.631262 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-99q4x\" (UID: \"ffe01856-0cc9-447f-901b-d95c5767d43b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:19:57.631457 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:57.631387 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:19:57.631511 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:57.631479 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls podName:ffe01856-0cc9-447f-901b-d95c5767d43b nodeName:}" failed. No retries permitted until 2026-04-16 18:19:58.631463232 +0000 UTC m=+141.539043240 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls") pod "cluster-samples-operator-667775844f-99q4x" (UID: "ffe01856-0cc9-447f-901b-d95c5767d43b") : secret "samples-operator-tls" not found Apr 16 18:19:57.633070 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.633045 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-rlbq9"] Apr 16 18:19:57.636447 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:19:57.636418 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96b64fed_ad15_4e9c_b012_f7c3cac2fcb4.slice/crio-fc49bea726be26f21a9f9326a199b91df01d57ad2f8a0970efc76f62d3405d28 WatchSource:0}: Error finding container fc49bea726be26f21a9f9326a199b91df01d57ad2f8a0970efc76f62d3405d28: Status 404 returned error can't find the container with id fc49bea726be26f21a9f9326a199b91df01d57ad2f8a0970efc76f62d3405d28 Apr 16 18:19:57.636605 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.636587 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-6tvbf"] Apr 16 18:19:57.639760 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:19:57.639739 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f3ed9b_bbc2_445b_9380_064dc07640d5.slice/crio-e08b618c129a752b79fc001bb8507a45a96188cac6a44bbb19534a0e2ebe0d52 WatchSource:0}: Error finding container e08b618c129a752b79fc001bb8507a45a96188cac6a44bbb19534a0e2ebe0d52: Status 404 returned error can't find the container with id e08b618c129a752b79fc001bb8507a45a96188cac6a44bbb19534a0e2ebe0d52 Apr 16 18:19:57.732501 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:57.732471 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:57.732671 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:57.732618 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:19:57.732671 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:57.732638 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84d7cb9877-cnlt9: secret "image-registry-tls" not found Apr 16 18:19:57.732772 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:57.732684 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls podName:701b7355-0fb2-412e-9af7-09e007fb99bc nodeName:}" failed. No retries permitted until 2026-04-16 18:19:58.732670807 +0000 UTC m=+141.640250814 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls") pod "image-registry-84d7cb9877-cnlt9" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc") : secret "image-registry-tls" not found Apr 16 18:19:58.064023 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:58.063985 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rlbq9" event={"ID":"96b64fed-ad15-4e9c-b012-f7c3cac2fcb4","Type":"ContainerStarted","Data":"83be32b85ff88410c1133c1872d6692b552bebeb7dffda6d5c5665f194dc1d78"} Apr 16 18:19:58.064510 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:58.064032 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rlbq9" event={"ID":"96b64fed-ad15-4e9c-b012-f7c3cac2fcb4","Type":"ContainerStarted","Data":"fc49bea726be26f21a9f9326a199b91df01d57ad2f8a0970efc76f62d3405d28"} Apr 16 18:19:58.065481 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:58.065453 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" event={"ID":"f3230a09-30be-4152-ac36-65d7911245a2","Type":"ContainerStarted","Data":"d0bb9a363a83b34e91b688c3bab7194bae9210b5f0bfe077e71cf7124f00314f"} Apr 16 18:19:58.066712 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:58.066687 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rdhmq" event={"ID":"13962cee-2aff-4652-a7e9-530d3125242a","Type":"ContainerStarted","Data":"61f0b57f164a83ed1160a052df1e47c08d7f1597150fbf6d758f5c848c016aeb"} Apr 16 18:19:58.067722 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:58.067697 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" event={"ID":"f8f3ed9b-bbc2-445b-9380-064dc07640d5","Type":"ContainerStarted","Data":"e08b618c129a752b79fc001bb8507a45a96188cac6a44bbb19534a0e2ebe0d52"} Apr 16 18:19:58.081256 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:58.081136 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rlbq9" podStartSLOduration=2.081121591 podStartE2EDuration="2.081121591s" podCreationTimestamp="2026-04-16 18:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:58.080839051 +0000 UTC m=+140.988419076" watchObservedRunningTime="2026-04-16 18:19:58.081121591 +0000 UTC m=+140.988701619" Apr 16 18:19:58.641059 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:58.641019 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-99q4x\" (UID: \"ffe01856-0cc9-447f-901b-d95c5767d43b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:19:58.641235 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:58.641164 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:19:58.641310 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:58.641241 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls podName:ffe01856-0cc9-447f-901b-d95c5767d43b nodeName:}" failed. No retries permitted until 2026-04-16 18:20:00.641220706 +0000 UTC m=+143.548800713 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls") pod "cluster-samples-operator-667775844f-99q4x" (UID: "ffe01856-0cc9-447f-901b-d95c5767d43b") : secret "samples-operator-tls" not found Apr 16 18:19:58.742493 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:58.741882 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:19:58.742493 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:58.742033 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:19:58.742493 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:58.742048 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84d7cb9877-cnlt9: secret "image-registry-tls" not found Apr 16 18:19:58.742493 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:19:58.742119 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls podName:701b7355-0fb2-412e-9af7-09e007fb99bc nodeName:}" failed. No retries permitted until 2026-04-16 18:20:00.742099691 +0000 UTC m=+143.649679711 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls") pod "image-registry-84d7cb9877-cnlt9" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc") : secret "image-registry-tls" not found Apr 16 18:19:59.071556 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:59.071514 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rdhmq" event={"ID":"13962cee-2aff-4652-a7e9-530d3125242a","Type":"ContainerStarted","Data":"912f126bcd24223b02ba5d6740a890ff106187442ac4f89a33760904b363cb67"} Apr 16 18:19:59.088056 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:19:59.088000 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-rdhmq" podStartSLOduration=0.648535312 podStartE2EDuration="2.087984778s" podCreationTimestamp="2026-04-16 18:19:57 +0000 UTC" firstStartedPulling="2026-04-16 18:19:57.48901428 +0000 UTC m=+140.396594287" lastFinishedPulling="2026-04-16 18:19:58.928463746 +0000 UTC m=+141.836043753" observedRunningTime="2026-04-16 18:19:59.087194543 +0000 UTC m=+141.994774584" watchObservedRunningTime="2026-04-16 18:19:59.087984778 +0000 UTC m=+141.995564807" Apr 16 18:20:00.658454 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:00.658411 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-99q4x\" (UID: \"ffe01856-0cc9-447f-901b-d95c5767d43b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:20:00.658947 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:00.658562 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:20:00.658947 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:00.658627 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls podName:ffe01856-0cc9-447f-901b-d95c5767d43b nodeName:}" failed. No retries permitted until 2026-04-16 18:20:04.658611762 +0000 UTC m=+147.566191768 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls") pod "cluster-samples-operator-667775844f-99q4x" (UID: "ffe01856-0cc9-447f-901b-d95c5767d43b") : secret "samples-operator-tls" not found Apr 16 18:20:00.759174 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:00.759134 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:20:00.759354 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:00.759252 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:20:00.759354 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:00.759264 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84d7cb9877-cnlt9: secret "image-registry-tls" not found Apr 16 18:20:00.759354 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:00.759314 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls podName:701b7355-0fb2-412e-9af7-09e007fb99bc nodeName:}" failed. No retries permitted until 2026-04-16 18:20:04.759299471 +0000 UTC m=+147.666879478 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls") pod "image-registry-84d7cb9877-cnlt9" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc") : secret "image-registry-tls" not found Apr 16 18:20:01.077043 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:01.077003 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" event={"ID":"f8f3ed9b-bbc2-445b-9380-064dc07640d5","Type":"ContainerStarted","Data":"221c969e9521c17ce06ff974e4846af995a0d85290779a169d6fd82e7b337e56"} Apr 16 18:20:01.078297 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:01.078277 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/0.log" Apr 16 18:20:01.078388 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:01.078310 2579 generic.go:358] "Generic (PLEG): container finished" podID="f3230a09-30be-4152-ac36-65d7911245a2" containerID="9af1dd980e5ef53f63e376c0f1b91bb61bcb8991728ecd2f089e986b7a02827c" exitCode=255 Apr 16 18:20:01.078388 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:01.078338 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" event={"ID":"f3230a09-30be-4152-ac36-65d7911245a2","Type":"ContainerDied","Data":"9af1dd980e5ef53f63e376c0f1b91bb61bcb8991728ecd2f089e986b7a02827c"} Apr 16 18:20:01.078630 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:01.078613 2579 scope.go:117] "RemoveContainer" containerID="9af1dd980e5ef53f63e376c0f1b91bb61bcb8991728ecd2f089e986b7a02827c" Apr 16 18:20:01.094795 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:01.094752 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" podStartSLOduration=2.658607826 
podStartE2EDuration="5.094735426s" podCreationTimestamp="2026-04-16 18:19:56 +0000 UTC" firstStartedPulling="2026-04-16 18:19:57.641327594 +0000 UTC m=+140.548907604" lastFinishedPulling="2026-04-16 18:20:00.077455185 +0000 UTC m=+142.985035204" observedRunningTime="2026-04-16 18:20:01.093676039 +0000 UTC m=+144.001256107" watchObservedRunningTime="2026-04-16 18:20:01.094735426 +0000 UTC m=+144.002315455" Apr 16 18:20:02.084776 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:02.084750 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/1.log" Apr 16 18:20:02.085154 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:02.085101 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/0.log" Apr 16 18:20:02.085154 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:02.085131 2579 generic.go:358] "Generic (PLEG): container finished" podID="f3230a09-30be-4152-ac36-65d7911245a2" containerID="a5f9d215265298590da5c47aaa7374c6e0096f9c4968a9abf2e35f26818200ac" exitCode=255 Apr 16 18:20:02.085269 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:02.085159 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" event={"ID":"f3230a09-30be-4152-ac36-65d7911245a2","Type":"ContainerDied","Data":"a5f9d215265298590da5c47aaa7374c6e0096f9c4968a9abf2e35f26818200ac"} Apr 16 18:20:02.085269 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:02.085197 2579 scope.go:117] "RemoveContainer" containerID="9af1dd980e5ef53f63e376c0f1b91bb61bcb8991728ecd2f089e986b7a02827c" Apr 16 18:20:02.085703 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:02.085681 2579 scope.go:117] "RemoveContainer" containerID="a5f9d215265298590da5c47aaa7374c6e0096f9c4968a9abf2e35f26818200ac" Apr 16 18:20:02.085896 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:02.085878 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-hjml7_openshift-console-operator(f3230a09-30be-4152-ac36-65d7911245a2)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" podUID="f3230a09-30be-4152-ac36-65d7911245a2" Apr 16 18:20:03.089555 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:03.089526 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/1.log" Apr 16 18:20:03.089923 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:03.089863 2579 scope.go:117] "RemoveContainer" containerID="a5f9d215265298590da5c47aaa7374c6e0096f9c4968a9abf2e35f26818200ac" Apr 16 18:20:03.090053 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:03.090036 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-hjml7_openshift-console-operator(f3230a09-30be-4152-ac36-65d7911245a2)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" podUID="f3230a09-30be-4152-ac36-65d7911245a2" Apr 16 18:20:03.595353 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:03.595326 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-sfjlw_0c49a704-d49e-48e8-a3bf-2cbbf59da5a5/dns-node-resolver/0.log" Apr 16 18:20:04.195520 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:04.195497 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-g2bjm_a3216a58-ad89-4814-b5b4-7ae5bf98510e/node-ca/0.log" Apr 16 18:20:04.689041 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:04.689005 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-99q4x\" (UID: \"ffe01856-0cc9-447f-901b-d95c5767d43b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:20:04.689217 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:04.689152 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:20:04.689301 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:04.689228 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls podName:ffe01856-0cc9-447f-901b-d95c5767d43b nodeName:}" failed. No retries permitted until 2026-04-16 18:20:12.689212011 +0000 UTC m=+155.596792018 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls") pod "cluster-samples-operator-667775844f-99q4x" (UID: "ffe01856-0cc9-447f-901b-d95c5767d43b") : secret "samples-operator-tls" not found Apr 16 18:20:04.790350 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:04.790319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:20:04.790524 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:04.790450 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:20:04.790524 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:04.790461 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-84d7cb9877-cnlt9: secret "image-registry-tls" not found Apr 16 18:20:04.790524 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:04.790509 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls podName:701b7355-0fb2-412e-9af7-09e007fb99bc nodeName:}" failed. No retries permitted until 2026-04-16 18:20:12.790495875 +0000 UTC m=+155.698075881 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls") pod "image-registry-84d7cb9877-cnlt9" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc") : secret "image-registry-tls" not found Apr 16 18:20:07.269871 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:07.269815 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:20:07.269871 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:07.269873 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:20:07.270290 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:07.270217 2579 scope.go:117] "RemoveContainer" containerID="a5f9d215265298590da5c47aaa7374c6e0096f9c4968a9abf2e35f26818200ac" Apr 16 18:20:07.270423 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:07.270383 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-hjml7_openshift-console-operator(f3230a09-30be-4152-ac36-65d7911245a2)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" podUID="f3230a09-30be-4152-ac36-65d7911245a2" Apr 16 18:20:12.755012 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:12.754961 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-99q4x\" (UID: \"ffe01856-0cc9-447f-901b-d95c5767d43b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:20:12.757277 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:12.757256 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffe01856-0cc9-447f-901b-d95c5767d43b-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-99q4x\" (UID: \"ffe01856-0cc9-447f-901b-d95c5767d43b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:20:12.856233 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:12.856203 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:20:12.858308 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:12.858286 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls\") pod \"image-registry-84d7cb9877-cnlt9\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:20:12.861195 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:12.861173 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" Apr 16 18:20:12.961693 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:12.961663 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:20:12.973787 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:12.973715 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x"] Apr 16 18:20:13.083656 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:13.083625 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84d7cb9877-cnlt9"] Apr 16 18:20:13.086346 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:20:13.086321 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod701b7355_0fb2_412e_9af7_09e007fb99bc.slice/crio-d9f02e49481fb331c2e010e83545afad27f18690e16d11e19bf69ad6d93bb0af WatchSource:0}: Error finding container d9f02e49481fb331c2e010e83545afad27f18690e16d11e19bf69ad6d93bb0af: Status 404 returned error can't find the container with id d9f02e49481fb331c2e010e83545afad27f18690e16d11e19bf69ad6d93bb0af Apr 16 18:20:13.113606 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:13.113580 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" event={"ID":"701b7355-0fb2-412e-9af7-09e007fb99bc","Type":"ContainerStarted","Data":"d9f02e49481fb331c2e010e83545afad27f18690e16d11e19bf69ad6d93bb0af"} Apr 16 18:20:13.114456 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:13.114435 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" event={"ID":"ffe01856-0cc9-447f-901b-d95c5767d43b","Type":"ContainerStarted","Data":"c7ce04e827920b42ffc5842a9a3ee92665dda2180c3344598e6f3e3e3bf12e81"} Apr 16 18:20:13.531576 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:13.531533 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mh6jk" podUID="15e80e92-2f17-4ce6-a0cc-2073e197b9c2" Apr 16 18:20:13.545671 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:13.545640 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-t9hjb" podUID="d06241a9-55bd-4260-9855-06114156f4d2" Apr 16 18:20:13.682197 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:13.682157 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-h2jgj" podUID="a84444e3-6cab-4290-a61c-c01132150e31" Apr 16 18:20:14.118495 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:14.118461 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" event={"ID":"701b7355-0fb2-412e-9af7-09e007fb99bc","Type":"ContainerStarted","Data":"49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767"} Apr 16 18:20:14.118927 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:14.118510 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mh6jk" Apr 16 18:20:14.118927 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:14.118619 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:20:14.149889 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:14.149843 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" podStartSLOduration=17.14982902 podStartE2EDuration="17.14982902s" podCreationTimestamp="2026-04-16 18:19:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:14.149559211 +0000 UTC m=+157.057139241" watchObservedRunningTime="2026-04-16 18:20:14.14982902 +0000 UTC m=+157.057409049" Apr 16 18:20:15.122263 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:15.122220 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" event={"ID":"ffe01856-0cc9-447f-901b-d95c5767d43b","Type":"ContainerStarted","Data":"07a5a4e889c44528134042457bd6f517f3e4265c9ecf7a15a195f73f727fbe57"} Apr 16 18:20:15.122263 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:15.122263 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" event={"ID":"ffe01856-0cc9-447f-901b-d95c5767d43b","Type":"ContainerStarted","Data":"9c30811664fe74e668433e9921a57be21d6048d97402a9728b88bed55ddb7faa"} Apr 16 18:20:15.145845 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:15.145798 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-99q4x" podStartSLOduration=17.597748432 podStartE2EDuration="19.145784867s" podCreationTimestamp="2026-04-16 18:19:56 +0000 UTC" firstStartedPulling="2026-04-16 18:20:13.023281923 +0000 UTC m=+155.930861934" lastFinishedPulling="2026-04-16 18:20:14.571318363 +0000 UTC m=+157.478898369" observedRunningTime="2026-04-16 18:20:15.145383824 +0000 UTC m=+158.052963866" watchObservedRunningTime="2026-04-16 18:20:15.145784867 +0000 UTC m=+158.053364895" Apr 16 18:20:18.398785 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:18.398744 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk" Apr 16 18:20:18.400993 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:18.400973 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/15e80e92-2f17-4ce6-a0cc-2073e197b9c2-metrics-tls\") pod \"dns-default-mh6jk\" (UID: \"15e80e92-2f17-4ce6-a0cc-2073e197b9c2\") " pod="openshift-dns/dns-default-mh6jk" Apr 16 18:20:18.499806 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:18.499770 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb" Apr 16 18:20:18.502036 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:18.502011 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d06241a9-55bd-4260-9855-06114156f4d2-cert\") pod \"ingress-canary-t9hjb\" (UID: \"d06241a9-55bd-4260-9855-06114156f4d2\") " pod="openshift-ingress-canary/ingress-canary-t9hjb" Apr 16 18:20:18.622155 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:18.622125 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8xm9h\"" Apr 16 18:20:18.630166 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:18.630147 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mh6jk" Apr 16 18:20:18.742327 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:18.742303 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mh6jk"] Apr 16 18:20:18.745091 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:20:18.745064 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15e80e92_2f17_4ce6_a0cc_2073e197b9c2.slice/crio-e3e5ba1bec06c15626711ea2f0977bafb05256d8f2f5504c25ab4865e3b6cbb2 WatchSource:0}: Error finding container e3e5ba1bec06c15626711ea2f0977bafb05256d8f2f5504c25ab4865e3b6cbb2: Status 404 returned error can't find the container with id e3e5ba1bec06c15626711ea2f0977bafb05256d8f2f5504c25ab4865e3b6cbb2 Apr 16 18:20:19.134751 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:19.134717 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mh6jk" event={"ID":"15e80e92-2f17-4ce6-a0cc-2073e197b9c2","Type":"ContainerStarted","Data":"e3e5ba1bec06c15626711ea2f0977bafb05256d8f2f5504c25ab4865e3b6cbb2"} Apr 16 18:20:21.141454 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:21.141419 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mh6jk" event={"ID":"15e80e92-2f17-4ce6-a0cc-2073e197b9c2","Type":"ContainerStarted","Data":"ac555a281a99b640f434dc62cab4f69e6d4aa17c8859a418fc0e91d3c0bd515b"} Apr 16 18:20:21.141454 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:21.141451 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mh6jk" event={"ID":"15e80e92-2f17-4ce6-a0cc-2073e197b9c2","Type":"ContainerStarted","Data":"791f27392e1be931907637d726bc7d82859012c74e521b89f2e9d40e722a1c17"} Apr 16 18:20:21.141849 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:21.141526 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mh6jk" Apr 16 18:20:21.160297 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:21.160259 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mh6jk" podStartSLOduration=129.824656241 podStartE2EDuration="2m11.160247763s" podCreationTimestamp="2026-04-16 18:18:10 +0000 UTC" firstStartedPulling="2026-04-16 18:20:18.747472993 +0000 UTC m=+161.655053001" lastFinishedPulling="2026-04-16 18:20:20.083064516 +0000 UTC m=+162.990644523" observedRunningTime="2026-04-16 18:20:21.158952038 +0000 UTC m=+164.066532066" watchObservedRunningTime="2026-04-16 18:20:21.160247763 +0000 UTC m=+164.067827830" Apr 16 18:20:22.665082 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:22.665050 2579 scope.go:117] "RemoveContainer" containerID="a5f9d215265298590da5c47aaa7374c6e0096f9c4968a9abf2e35f26818200ac" Apr 16 18:20:23.148665 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.148639 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:20:23.148999 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.148983 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/1.log" Apr 16 18:20:23.149048 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.149020 2579 generic.go:358] "Generic (PLEG): container finished" podID="f3230a09-30be-4152-ac36-65d7911245a2" containerID="cc215715786f44b2c0bdcb9f4622b015d27472fcf554ab43bcf8260da31cc35e" exitCode=255 Apr 16 18:20:23.149112 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.149098 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" event={"ID":"f3230a09-30be-4152-ac36-65d7911245a2","Type":"ContainerDied","Data":"cc215715786f44b2c0bdcb9f4622b015d27472fcf554ab43bcf8260da31cc35e"} Apr 16 18:20:23.149145 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.149134 2579 scope.go:117] "RemoveContainer" containerID="a5f9d215265298590da5c47aaa7374c6e0096f9c4968a9abf2e35f26818200ac" Apr 16 18:20:23.149516 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.149499 2579 scope.go:117] "RemoveContainer" containerID="cc215715786f44b2c0bdcb9f4622b015d27472fcf554ab43bcf8260da31cc35e" Apr 16 18:20:23.149697 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:23.149680 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-hjml7_openshift-console-operator(f3230a09-30be-4152-ac36-65d7911245a2)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" podUID="f3230a09-30be-4152-ac36-65d7911245a2" Apr 16 18:20:23.465617 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.465538 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-84d7cb9877-cnlt9"] Apr 16 18:20:23.500010 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.499983 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-57dcf45987-66g2j"] Apr 16 18:20:23.503951 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.503934 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.545599 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.545569 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57dcf45987-66g2j"] Apr 16 18:20:23.589650 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.589624 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-lss66"] Apr 16 18:20:23.592729 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.592714 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.597068 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.597046 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4fhn9\"" Apr 16 18:20:23.597451 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.597376 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:20:23.597451 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.597431 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:20:23.604569 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.604538 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lss66"] Apr 16 18:20:23.634981 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.634954 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24dc120e-24c7-4a98-a3ca-c4e002937a7b-registry-certificates\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.635084 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.634985 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24dc120e-24c7-4a98-a3ca-c4e002937a7b-bound-sa-token\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.635084 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.635007 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lrkh\" (UniqueName: \"kubernetes.io/projected/24dc120e-24c7-4a98-a3ca-c4e002937a7b-kube-api-access-4lrkh\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.635084 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.635038 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/24dc120e-24c7-4a98-a3ca-c4e002937a7b-image-registry-private-configuration\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.635084 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.635062 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24dc120e-24c7-4a98-a3ca-c4e002937a7b-registry-tls\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.635215 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.635147 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24dc120e-24c7-4a98-a3ca-c4e002937a7b-ca-trust-extracted\") pod 
\"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.635215 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.635184 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24dc120e-24c7-4a98-a3ca-c4e002937a7b-installation-pull-secrets\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.635215 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.635206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24dc120e-24c7-4a98-a3ca-c4e002937a7b-trusted-ca\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.736495 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.736387 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24dc120e-24c7-4a98-a3ca-c4e002937a7b-installation-pull-secrets\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.736495 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.736453 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a4466154-2f02-4daf-a450-7d13e7879820-crio-socket\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.736495 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.736480 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a4466154-2f02-4daf-a450-7d13e7879820-data-volume\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.737112 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.736509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24dc120e-24c7-4a98-a3ca-c4e002937a7b-trusted-ca\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.737112 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.736536 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a4466154-2f02-4daf-a450-7d13e7879820-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.737112 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.736589 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/a4466154-2f02-4daf-a450-7d13e7879820-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.737112 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.736636 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24dc120e-24c7-4a98-a3ca-c4e002937a7b-registry-certificates\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.737112 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.736656 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24dc120e-24c7-4a98-a3ca-c4e002937a7b-bound-sa-token\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.737112 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.736673 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lrkh\" (UniqueName: \"kubernetes.io/projected/24dc120e-24c7-4a98-a3ca-c4e002937a7b-kube-api-access-4lrkh\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.737112 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.736696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/24dc120e-24c7-4a98-a3ca-c4e002937a7b-image-registry-private-configuration\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.737112 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.736723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24dc120e-24c7-4a98-a3ca-c4e002937a7b-registry-tls\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.737112 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.737020 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8mg6\" (UniqueName: \"kubernetes.io/projected/a4466154-2f02-4daf-a450-7d13e7879820-kube-api-access-t8mg6\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.737112 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.737083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/24dc120e-24c7-4a98-a3ca-c4e002937a7b-ca-trust-extracted\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.737615 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.737472 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/24dc120e-24c7-4a98-a3ca-c4e002937a7b-ca-trust-extracted\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.737615 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.737555 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/24dc120e-24c7-4a98-a3ca-c4e002937a7b-registry-certificates\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.738331 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.738304 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24dc120e-24c7-4a98-a3ca-c4e002937a7b-trusted-ca\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.739482 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.739447 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/24dc120e-24c7-4a98-a3ca-c4e002937a7b-installation-pull-secrets\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.739592 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.739517 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/24dc120e-24c7-4a98-a3ca-c4e002937a7b-registry-tls\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.739642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.739587 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/24dc120e-24c7-4a98-a3ca-c4e002937a7b-image-registry-private-configuration\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.747420 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.747377 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lrkh\" (UniqueName: \"kubernetes.io/projected/24dc120e-24c7-4a98-a3ca-c4e002937a7b-kube-api-access-4lrkh\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.747500 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.747459 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24dc120e-24c7-4a98-a3ca-c4e002937a7b-bound-sa-token\") pod \"image-registry-57dcf45987-66g2j\" (UID: \"24dc120e-24c7-4a98-a3ca-c4e002937a7b\") " pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.814560 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.814528 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:23.837969 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.837942 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8mg6\" (UniqueName: \"kubernetes.io/projected/a4466154-2f02-4daf-a450-7d13e7879820-kube-api-access-t8mg6\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.838116 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.837988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a4466154-2f02-4daf-a450-7d13e7879820-crio-socket\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.838116 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.838006 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a4466154-2f02-4daf-a450-7d13e7879820-data-volume\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.838116 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.838086 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a4466154-2f02-4daf-a450-7d13e7879820-crio-socket\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.838232 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.838119 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a4466154-2f02-4daf-a450-7d13e7879820-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.838232 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.838157 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a4466154-2f02-4daf-a450-7d13e7879820-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.838330 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.838314 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a4466154-2f02-4daf-a450-7d13e7879820-data-volume\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.838666 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.838649 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a4466154-2f02-4daf-a450-7d13e7879820-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.840283 ip-10-0-138-88 kubenswrapper[2579]: 
I0416 18:20:23.840259 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a4466154-2f02-4daf-a450-7d13e7879820-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.848403 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.848379 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8mg6\" (UniqueName: \"kubernetes.io/projected/a4466154-2f02-4daf-a450-7d13e7879820-kube-api-access-t8mg6\") pod \"insights-runtime-extractor-lss66\" (UID: \"a4466154-2f02-4daf-a450-7d13e7879820\") " pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.901945 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.901917 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lss66" Apr 16 18:20:23.933920 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:23.933887 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57dcf45987-66g2j"] Apr 16 18:20:23.937817 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:20:23.937761 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24dc120e_24c7_4a98_a3ca_c4e002937a7b.slice/crio-4ed2dfc4b2dc54d1bc4e8ac3500338a9f700ca817147efcc953b2968610ce677 WatchSource:0}: Error finding container 4ed2dfc4b2dc54d1bc4e8ac3500338a9f700ca817147efcc953b2968610ce677: Status 404 returned error can't find the container with id 4ed2dfc4b2dc54d1bc4e8ac3500338a9f700ca817147efcc953b2968610ce677 Apr 16 18:20:24.023927 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:24.023894 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lss66"] Apr 16 18:20:24.028268 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:20:24.028235 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4466154_2f02_4daf_a450_7d13e7879820.slice/crio-eab3cc782e98a5575716c60aae566251d94ec03c05b2d6156157df3da5480f71 WatchSource:0}: Error finding container eab3cc782e98a5575716c60aae566251d94ec03c05b2d6156157df3da5480f71: Status 404 returned error can't find the container with id eab3cc782e98a5575716c60aae566251d94ec03c05b2d6156157df3da5480f71 Apr 16 18:20:24.152923 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:24.152891 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:20:24.154270 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:24.154244 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57dcf45987-66g2j" event={"ID":"24dc120e-24c7-4a98-a3ca-c4e002937a7b","Type":"ContainerStarted","Data":"6bf95c19b3b1e693f1a1e7d4f6abf403e9e1ba6677daa58d84cbe9e44f0e2481"} Apr 16 18:20:24.154417 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:24.154274 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57dcf45987-66g2j" event={"ID":"24dc120e-24c7-4a98-a3ca-c4e002937a7b","Type":"ContainerStarted","Data":"4ed2dfc4b2dc54d1bc4e8ac3500338a9f700ca817147efcc953b2968610ce677"} Apr 16 18:20:24.154500 ip-10-0-138-88 
kubenswrapper[2579]: I0416 18:20:24.154428 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:24.155503 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:24.155479 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lss66" event={"ID":"a4466154-2f02-4daf-a450-7d13e7879820","Type":"ContainerStarted","Data":"420ddcd4510e4d14d1cf5c5bfec8e7af489b72357faea4c9c12a45306bb39704"} Apr 16 18:20:24.155605 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:24.155507 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lss66" event={"ID":"a4466154-2f02-4daf-a450-7d13e7879820","Type":"ContainerStarted","Data":"eab3cc782e98a5575716c60aae566251d94ec03c05b2d6156157df3da5480f71"} Apr 16 18:20:24.174229 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:24.174171 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-57dcf45987-66g2j" podStartSLOduration=1.174159236 podStartE2EDuration="1.174159236s" podCreationTimestamp="2026-04-16 18:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:20:24.17366085 +0000 UTC m=+167.081240878" watchObservedRunningTime="2026-04-16 18:20:24.174159236 +0000 UTC m=+167.081739265" Apr 16 18:20:25.160360 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:25.160321 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lss66" event={"ID":"a4466154-2f02-4daf-a450-7d13e7879820","Type":"ContainerStarted","Data":"a40708c1a81990f2cb18ae4f1af5a760709d53ad554610af77a6b696953d49b2"} Apr 16 18:20:26.172190 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:26.172150 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lss66" event={"ID":"a4466154-2f02-4daf-a450-7d13e7879820","Type":"ContainerStarted","Data":"db5a42966af22c7877039f92c5d19cc2407b29d0a5d789dfe92f7ebea7e735cd"} Apr 16 18:20:26.191187 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:26.191144 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-lss66" podStartSLOduration=1.267381143 podStartE2EDuration="3.191131829s" podCreationTimestamp="2026-04-16 18:20:23 +0000 UTC" firstStartedPulling="2026-04-16 18:20:24.083984744 +0000 UTC m=+166.991564751" lastFinishedPulling="2026-04-16 18:20:26.007735428 +0000 UTC m=+168.915315437" observedRunningTime="2026-04-16 18:20:26.191016429 +0000 UTC m=+169.098596458" watchObservedRunningTime="2026-04-16 18:20:26.191131829 +0000 UTC m=+169.098711857" Apr 16 18:20:27.270335 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:27.270307 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:20:27.270335 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:27.270340 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" Apr 16 18:20:27.270717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:27.270647 2579 scope.go:117] "RemoveContainer" containerID="cc215715786f44b2c0bdcb9f4622b015d27472fcf554ab43bcf8260da31cc35e" Apr 16 18:20:27.270847 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:27.270830 
2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-hjml7_openshift-console-operator(f3230a09-30be-4152-ac36-65d7911245a2)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" podUID="f3230a09-30be-4152-ac36-65d7911245a2" Apr 16 18:20:27.665935 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:27.665846 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj" Apr 16 18:20:27.666109 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:27.665989 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t9hjb" Apr 16 18:20:27.668651 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:27.668632 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rxx2m\"" Apr 16 18:20:27.676717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:27.676695 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t9hjb" Apr 16 18:20:27.795748 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:27.795717 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t9hjb"] Apr 16 18:20:27.798975 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:20:27.798951 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd06241a9_55bd_4260_9855_06114156f4d2.slice/crio-a8f318a9f531d88f3204e99ca22cd8f34ba80a066c8b1c61a1fd43f94e260e29 WatchSource:0}: Error finding container a8f318a9f531d88f3204e99ca22cd8f34ba80a066c8b1c61a1fd43f94e260e29: Status 404 returned error can't find the container with id a8f318a9f531d88f3204e99ca22cd8f34ba80a066c8b1c61a1fd43f94e260e29 Apr 16 18:20:28.177822 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:28.177786 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t9hjb" event={"ID":"d06241a9-55bd-4260-9855-06114156f4d2","Type":"ContainerStarted","Data":"a8f318a9f531d88f3204e99ca22cd8f34ba80a066c8b1c61a1fd43f94e260e29"} Apr 16 18:20:30.187497 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:30.187462 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t9hjb" event={"ID":"d06241a9-55bd-4260-9855-06114156f4d2","Type":"ContainerStarted","Data":"3f7e2d988c141428b9cb8be5ba202e681a848fc3d92ca5cd192a7dd10a963f97"} Apr 16 18:20:30.206436 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:30.206370 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-t9hjb" podStartSLOduration=138.613291763 podStartE2EDuration="2m20.206358262s" podCreationTimestamp="2026-04-16 18:18:10 +0000 UTC" firstStartedPulling="2026-04-16 18:20:27.800726614 +0000 UTC m=+170.708306622" lastFinishedPulling="2026-04-16 18:20:29.393793112 +0000 UTC m=+172.301373121" observedRunningTime="2026-04-16 18:20:30.20618894 +0000 UTC m=+173.113768970" watchObservedRunningTime="2026-04-16 18:20:30.206358262 +0000 UTC m=+173.113938290" Apr 16 18:20:31.145916 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:31.145889 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mh6jk" Apr 16 18:20:33.470926 
Apr 16 18:20:33.471288 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.470944 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" podUID="701b7355-0fb2-412e-9af7-09e007fb99bc" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:20:33.696837 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.696809 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-8s6wd"]
Apr 16 18:20:33.713220 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.713196 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-8s6wd"]
Apr 16 18:20:33.713360 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.713306 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:33.715638 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.715611 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:20:33.715770 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.715726 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 18:20:33.716506 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.716484 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 18:20:33.716708 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.716508 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:20:33.716708 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.716507 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-28pwl\""
Apr 16 18:20:33.716708 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.716528 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 18:20:33.808848 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.808821 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-8s6wd\" (UID: \"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:33.808998 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.808860 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82-metrics-client-ca\") pod \"prometheus-operator-78f957474d-8s6wd\" (UID: \"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:33.808998 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.808922 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-8s6wd\" (UID: \"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:33.808998 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.808957 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9n2g\" (UniqueName: \"kubernetes.io/projected/e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82-kube-api-access-g9n2g\") pod \"prometheus-operator-78f957474d-8s6wd\" (UID: \"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:33.909287 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.909261 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-8s6wd\" (UID: \"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:33.909431 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.909299 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82-metrics-client-ca\") pod \"prometheus-operator-78f957474d-8s6wd\" (UID: \"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:33.909431 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.909335 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-8s6wd\" (UID: \"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:33.909431 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.909371 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9n2g\" (UniqueName: \"kubernetes.io/projected/e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82-kube-api-access-g9n2g\") pod \"prometheus-operator-78f957474d-8s6wd\" (UID: \"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:33.910036 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.910013 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82-metrics-client-ca\") pod \"prometheus-operator-78f957474d-8s6wd\" (UID: \"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:33.911760 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.911737 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-8s6wd\" (UID: \"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:33.911873 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.911789 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-8s6wd\" (UID: \"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:33.919500 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:33.919476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9n2g\" (UniqueName: \"kubernetes.io/projected/e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82-kube-api-access-g9n2g\") pod \"prometheus-operator-78f957474d-8s6wd\" (UID: \"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82\") " pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:34.022444 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:34.022413 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd"
Apr 16 18:20:34.136899 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:34.136853 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-8s6wd"]
Apr 16 18:20:34.140509 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:20:34.140486 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode55f90a5_c3c9_46e3_9ca2_8ce6b1413f82.slice/crio-de360c0554120bf60ca07c5441fc070bf99f49be6dd597a2e4656fe897205abc WatchSource:0}: Error finding container de360c0554120bf60ca07c5441fc070bf99f49be6dd597a2e4656fe897205abc: Status 404 returned error can't find the container with id de360c0554120bf60ca07c5441fc070bf99f49be6dd597a2e4656fe897205abc
Apr 16 18:20:34.198541 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:34.198511 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd" event={"ID":"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82","Type":"ContainerStarted","Data":"de360c0554120bf60ca07c5441fc070bf99f49be6dd597a2e4656fe897205abc"}
Apr 16 18:20:36.205366 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:36.205279 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd" event={"ID":"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82","Type":"ContainerStarted","Data":"7ff1c494bf23bcb3f98d7777c18cebd49c572e47115c4636ef12ba968bf7c9f9"}
Apr 16 18:20:36.205366 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:36.205314 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd" event={"ID":"e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82","Type":"ContainerStarted","Data":"e81067352343dc3d1af4f5d8c6ba2e1985849e54121d9abbc5af639d84cb9216"}
Apr 16 18:20:36.226118 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:36.226076 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-8s6wd" podStartSLOduration=1.530097696 podStartE2EDuration="3.226063352s" podCreationTimestamp="2026-04-16 18:20:33 +0000 UTC" firstStartedPulling="2026-04-16 18:20:34.142370989 +0000 UTC m=+177.049950996" lastFinishedPulling="2026-04-16 18:20:35.838336645 +0000 UTC m=+178.745916652" observedRunningTime="2026-04-16 18:20:36.225323722 +0000 UTC m=+179.132903747" watchObservedRunningTime="2026-04-16 18:20:36.226063352 +0000 UTC m=+179.133643381"
Apr 16 18:20:38.132252 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.132220 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8vj7t"]
Apr 16 18:20:38.136473 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.136452 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8vj7t"
Apr 16 18:20:38.139759 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.139723 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-w7vqp\""
Apr 16 18:20:38.139881 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.139867 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 18:20:38.140508 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.140484 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 18:20:38.141012 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.140994 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-rhbqn"]
Apr 16 18:20:38.141183 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.141168 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 18:20:38.144584 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.144568 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn"
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.146434 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.146386 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:20:38.146745 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.146727 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-j9l5p\"" Apr 16 18:20:38.146819 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.146731 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 18:20:38.146819 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.146769 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 18:20:38.157582 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.157562 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-rhbqn"] Apr 16 18:20:38.248930 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.248892 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.248930 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.248932 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/69f60c16-14f7-41f5-af8f-f30635d5ef32-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.249140 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.248955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0ad478e-fb4e-41fc-8942-d15c685f82b4-metrics-client-ca\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.249140 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.249000 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69f60c16-14f7-41f5-af8f-f30635d5ef32-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.249140 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.249021 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-wtmp\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.249140 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.249037 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-accelerators-collector-config\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.249140 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.249094 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f0ad478e-fb4e-41fc-8942-d15c685f82b4-root\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.249140 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.249110 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.249346 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.249163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.249346 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.249206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.249346 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.249240 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0ad478e-fb4e-41fc-8942-d15c685f82b4-sys\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.249346 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.249260 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-tls\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.249346 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.249289 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m52f\" (UniqueName: \"kubernetes.io/projected/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-api-access-5m52f\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 
18:20:38.249346 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.249315 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-textfile\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.249346 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.249330 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk4tx\" (UniqueName: \"kubernetes.io/projected/f0ad478e-fb4e-41fc-8942-d15c685f82b4-kube-api-access-vk4tx\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.350698 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.350660 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69f60c16-14f7-41f5-af8f-f30635d5ef32-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.350698 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.350700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-wtmp\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.350944 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.350719 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-accelerators-collector-config\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.350944 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.350838 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f0ad478e-fb4e-41fc-8942-d15c685f82b4-root\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.350944 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.350858 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-wtmp\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.350944 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.350878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.350944 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.350910 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.350944 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.350935 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.350944 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.350938 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f0ad478e-fb4e-41fc-8942-d15c685f82b4-root\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.351285 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351008 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0ad478e-fb4e-41fc-8942-d15c685f82b4-sys\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.351285 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:38.351013 2579 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 18:20:38.351285 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351045 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-tls\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.351285 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:38.351086 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-state-metrics-tls podName:69f60c16-14f7-41f5-af8f-f30635d5ef32 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:38.851063399 +0000 UTC m=+181.758643429 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-rhbqn" (UID: "69f60c16-14f7-41f5-af8f-f30635d5ef32") : secret "kube-state-metrics-tls" not found Apr 16 18:20:38.351285 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351116 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0ad478e-fb4e-41fc-8942-d15c685f82b4-sys\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.351285 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351122 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m52f\" (UniqueName: \"kubernetes.io/projected/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-api-access-5m52f\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.351285 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351185 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-textfile\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.351285 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351230 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vk4tx\" (UniqueName: \"kubernetes.io/projected/f0ad478e-fb4e-41fc-8942-d15c685f82b4-kube-api-access-vk4tx\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.351285 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351262 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.351820 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351306 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/69f60c16-14f7-41f5-af8f-f30635d5ef32-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.351820 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0ad478e-fb4e-41fc-8942-d15c685f82b4-metrics-client-ca\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.351820 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351342 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-accelerators-collector-config\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.351820 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351580 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69f60c16-14f7-41f5-af8f-f30635d5ef32-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.351820 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351682 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-textfile\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.352071 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.351909 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/69f60c16-14f7-41f5-af8f-f30635d5ef32-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.352127 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.352099 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.352127 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.352106 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0ad478e-fb4e-41fc-8942-d15c685f82b4-metrics-client-ca\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.353533 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.353514 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.354245 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.354219 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.354329 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.354293 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/f0ad478e-fb4e-41fc-8942-d15c685f82b4-node-exporter-tls\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.360206 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.360188 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk4tx\" (UniqueName: \"kubernetes.io/projected/f0ad478e-fb4e-41fc-8942-d15c685f82b4-kube-api-access-vk4tx\") pod \"node-exporter-8vj7t\" (UID: \"f0ad478e-fb4e-41fc-8942-d15c685f82b4\") " pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.364133 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.364114 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m52f\" (UniqueName: \"kubernetes.io/projected/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-api-access-5m52f\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.446207 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.446117 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8vj7t" Apr 16 18:20:38.454431 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:20:38.454381 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0ad478e_fb4e_41fc_8942_d15c685f82b4.slice/crio-7172d995f5a1f552c262bf6f10a1189d4378562a8298c3ae7dcad2b6f89a0870 WatchSource:0}: Error finding container 7172d995f5a1f552c262bf6f10a1189d4378562a8298c3ae7dcad2b6f89a0870: Status 404 returned error can't find the container with id 7172d995f5a1f552c262bf6f10a1189d4378562a8298c3ae7dcad2b6f89a0870 Apr 16 18:20:38.854760 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.854726 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:38.857136 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:38.857110 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/69f60c16-14f7-41f5-af8f-f30635d5ef32-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-rhbqn\" (UID: \"69f60c16-14f7-41f5-af8f-f30635d5ef32\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:39.052458 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:39.052424 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" Apr 16 18:20:39.214016 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:39.213983 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8vj7t" event={"ID":"f0ad478e-fb4e-41fc-8942-d15c685f82b4","Type":"ContainerStarted","Data":"7172d995f5a1f552c262bf6f10a1189d4378562a8298c3ae7dcad2b6f89a0870"} Apr 16 18:20:39.333774 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:39.333749 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-rhbqn"] Apr 16 18:20:39.336103 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:20:39.336080 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69f60c16_14f7_41f5_af8f_f30635d5ef32.slice/crio-3ffb0f34f3c9a92d34fd2181f37f6efe748088e27e5f59fe489b47a699a4fc82 WatchSource:0}: Error finding container 3ffb0f34f3c9a92d34fd2181f37f6efe748088e27e5f59fe489b47a699a4fc82: Status 404 returned error can't find the container with id 3ffb0f34f3c9a92d34fd2181f37f6efe748088e27e5f59fe489b47a699a4fc82 Apr 16 18:20:40.219081 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:40.219048 2579 generic.go:358] "Generic (PLEG): container finished" podID="f0ad478e-fb4e-41fc-8942-d15c685f82b4" containerID="bddc3f74997bf88b04c6c8c7ed069095963ed40c01ad592731aab5d6f3ab9ba9" exitCode=0 Apr 16 18:20:40.219566 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:40.219148 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8vj7t" event={"ID":"f0ad478e-fb4e-41fc-8942-d15c685f82b4","Type":"ContainerDied","Data":"bddc3f74997bf88b04c6c8c7ed069095963ed40c01ad592731aab5d6f3ab9ba9"} Apr 16 18:20:40.220844 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:40.220816 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" event={"ID":"69f60c16-14f7-41f5-af8f-f30635d5ef32","Type":"ContainerStarted","Data":"3ffb0f34f3c9a92d34fd2181f37f6efe748088e27e5f59fe489b47a699a4fc82"} Apr 16 18:20:41.225796 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:41.225758 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" event={"ID":"69f60c16-14f7-41f5-af8f-f30635d5ef32","Type":"ContainerStarted","Data":"351d4863a4add7de7c49cf6c64f2a6545623f90689ba080936e05f368fdec5da"} Apr 16 18:20:41.225796 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:41.225800 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" event={"ID":"69f60c16-14f7-41f5-af8f-f30635d5ef32","Type":"ContainerStarted","Data":"9681a8c03c26d7342d003e59392bbaf50e3dfe7183d66f5a616d9c37ad51ce7a"} Apr 16 18:20:41.226324 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:41.225815 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" event={"ID":"69f60c16-14f7-41f5-af8f-f30635d5ef32","Type":"ContainerStarted","Data":"c1be39884473e0a9a84a7a1159fdbaf90ba8d449b6bb4108ea7ff4078ca65d43"} Apr 16 18:20:41.227608 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:41.227589 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8vj7t" event={"ID":"f0ad478e-fb4e-41fc-8942-d15c685f82b4","Type":"ContainerStarted","Data":"c4c3ab932c09c8194efb42f6480dd344a98536e773c4368cba91bfd7cb674ed3"} Apr 16 18:20:41.227669 
ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:41.227614 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8vj7t" event={"ID":"f0ad478e-fb4e-41fc-8942-d15c685f82b4","Type":"ContainerStarted","Data":"5a326f5a2e0fb604f3471833b16edf6bbe9368a5991ba2510fa72e08502c0089"} Apr 16 18:20:41.246168 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:41.246114 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-rhbqn" podStartSLOduration=1.888498593 podStartE2EDuration="3.246100049s" podCreationTimestamp="2026-04-16 18:20:38 +0000 UTC" firstStartedPulling="2026-04-16 18:20:39.338118684 +0000 UTC m=+182.245698691" lastFinishedPulling="2026-04-16 18:20:40.695720139 +0000 UTC m=+183.603300147" observedRunningTime="2026-04-16 18:20:41.245382882 +0000 UTC m=+184.152962922" watchObservedRunningTime="2026-04-16 18:20:41.246100049 +0000 UTC m=+184.153680072" Apr 16 18:20:41.262919 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:41.262867 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8vj7t" podStartSLOduration=2.461582587 podStartE2EDuration="3.262851624s" podCreationTimestamp="2026-04-16 18:20:38 +0000 UTC" firstStartedPulling="2026-04-16 18:20:38.456106278 +0000 UTC m=+181.363686287" lastFinishedPulling="2026-04-16 18:20:39.257375311 +0000 UTC m=+182.164955324" observedRunningTime="2026-04-16 18:20:41.262251587 +0000 UTC m=+184.169831626" watchObservedRunningTime="2026-04-16 18:20:41.262851624 +0000 UTC m=+184.170431653"
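The two "Observed pod startup duration" entries just above are internally consistent: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that same span with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted. A short sketch reproducing the kube-state-metrics numbers (timestamps copied from the entry above with the monotonic m=+... suffixes dropped; the SLO interpretation is inferred from these samples, not quoted from kubelet documentation):

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Timestamps from the kube-state-metrics-7479c89684-rhbqn entry above.
	created := parse("2026-04-16 18:20:38 +0000 UTC")
	firstPull := parse("2026-04-16 18:20:39.338118684 +0000 UTC")
	lastPull := parse("2026-04-16 18:20:40.695720139 +0000 UTC")
	observed := parse("2026-04-16 18:20:41.246100049 +0000 UTC")

	e2e := observed.Sub(created)         // podStartE2EDuration: 3.246100049s
	slo := e2e - lastPull.Sub(firstPull) // ~1.888498594s, within 1ns of the
	fmt.Println(e2e, slo)                // logged podStartSLOduration=1.888498593
}

The node-exporter entry checks out the same way: 3.262851624s minus its 0.801269033s pull window lands on the logged 2.4615...s SLO figure.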
Need to start a new one" pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.614213 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.614191 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9b6qeolvb0uu4\"" Apr 16 18:20:42.614336 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.614314 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:20:42.614449 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.614433 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 18:20:42.614520 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.614475 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 18:20:42.614580 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.614536 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-k2nwg\"" Apr 16 18:20:42.614580 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.614565 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 18:20:42.626618 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.626594 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-886754884-tkmnd"] Apr 16 18:20:42.691312 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.691265 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx92r\" (UniqueName: \"kubernetes.io/projected/929b560f-9d1d-4d0b-be84-3095c605bb4c-kube-api-access-cx92r\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.691528 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.691321 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/929b560f-9d1d-4d0b-be84-3095c605bb4c-metrics-server-audit-profiles\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.691528 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.691380 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929b560f-9d1d-4d0b-be84-3095c605bb4c-client-ca-bundle\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.691528 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.691461 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/929b560f-9d1d-4d0b-be84-3095c605bb4c-audit-log\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.691528 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.691500 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/929b560f-9d1d-4d0b-be84-3095c605bb4c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.691736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.691581 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/929b560f-9d1d-4d0b-be84-3095c605bb4c-secret-metrics-server-client-certs\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.691736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.691608 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/929b560f-9d1d-4d0b-be84-3095c605bb4c-secret-metrics-server-tls\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.792164 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.792127 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/929b560f-9d1d-4d0b-be84-3095c605bb4c-metrics-server-audit-profiles\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.792164 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.792164 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929b560f-9d1d-4d0b-be84-3095c605bb4c-client-ca-bundle\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.792440 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.792194 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/929b560f-9d1d-4d0b-be84-3095c605bb4c-audit-log\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.792440 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.792222 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/929b560f-9d1d-4d0b-be84-3095c605bb4c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.792440 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.792284 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/929b560f-9d1d-4d0b-be84-3095c605bb4c-secret-metrics-server-client-certs\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.792440 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.792308 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/929b560f-9d1d-4d0b-be84-3095c605bb4c-secret-metrics-server-tls\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.792440 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.792358 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx92r\" (UniqueName: \"kubernetes.io/projected/929b560f-9d1d-4d0b-be84-3095c605bb4c-kube-api-access-cx92r\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.792779 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.792753 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/929b560f-9d1d-4d0b-be84-3095c605bb4c-audit-log\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.793053 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.793031 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/929b560f-9d1d-4d0b-be84-3095c605bb4c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.793358 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.793334 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/929b560f-9d1d-4d0b-be84-3095c605bb4c-metrics-server-audit-profiles\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.794992 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.794967 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/929b560f-9d1d-4d0b-be84-3095c605bb4c-secret-metrics-server-tls\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.794992 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.794977 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/929b560f-9d1d-4d0b-be84-3095c605bb4c-client-ca-bundle\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.794992 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.794982 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/929b560f-9d1d-4d0b-be84-3095c605bb4c-secret-metrics-server-client-certs\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.804346 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.804319 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cx92r\" (UniqueName: \"kubernetes.io/projected/929b560f-9d1d-4d0b-be84-3095c605bb4c-kube-api-access-cx92r\") pod \"metrics-server-886754884-tkmnd\" (UID: \"929b560f-9d1d-4d0b-be84-3095c605bb4c\") " pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:42.919166 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:42.919072 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-886754884-tkmnd" Apr 16 18:20:43.045864 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:43.045827 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-886754884-tkmnd"] Apr 16 18:20:43.048539 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:20:43.048515 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod929b560f_9d1d_4d0b_be84_3095c605bb4c.slice/crio-19387daa85e2c6db8b5d5d8b80d2a9796a8a99ff5bd5e67ae727186be60b0071 WatchSource:0}: Error finding container 19387daa85e2c6db8b5d5d8b80d2a9796a8a99ff5bd5e67ae727186be60b0071: Status 404 returned error can't find the container with id 19387daa85e2c6db8b5d5d8b80d2a9796a8a99ff5bd5e67ae727186be60b0071 Apr 16 18:20:43.235108 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:43.235009 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-886754884-tkmnd" event={"ID":"929b560f-9d1d-4d0b-be84-3095c605bb4c","Type":"ContainerStarted","Data":"19387daa85e2c6db8b5d5d8b80d2a9796a8a99ff5bd5e67ae727186be60b0071"} Apr 16 18:20:43.470559 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:43.470526 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:20:44.374630 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.374055 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:20:44.382037 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.382010 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.384912 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.384726 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5oq1hdcp6ot0i\"" Apr 16 18:20:44.384912 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.384752 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:20:44.385085 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.384930 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-sqd8w\"" Apr 16 18:20:44.385085 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.384947 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:20:44.385085 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.385009 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:20:44.385085 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.385067 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:20:44.385606 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.385585 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:20:44.385741 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.385639 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:20:44.386142 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.386122 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:20:44.386493 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.386466 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:20:44.392437 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.387224 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:20:44.392437 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.387480 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:20:44.392437 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.390139 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:20:44.403661 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.403626 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:20:44.404256 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.404219 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:20:44.507361 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507317 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.507559 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507376 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.507559 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507482 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.507682 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507564 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.507682 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507601 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.507682 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507628 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.507836 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507686 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.507836 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507713 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.507836 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507739 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-web-config\") pod 
\"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.507836 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507788 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.508022 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507860 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.508022 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507910 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg99v\" (UniqueName: \"kubernetes.io/projected/f27db00e-d71f-4250-9b77-a1cc2bbd1756-kube-api-access-wg99v\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.508022 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507957 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-config\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.508022 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.507994 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.508201 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.508049 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f27db00e-d71f-4250-9b77-a1cc2bbd1756-config-out\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.508201 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.508078 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.508201 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.508104 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.508201 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.508178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f27db00e-d71f-4250-9b77-a1cc2bbd1756-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.609604 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.609753 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609616 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.609753 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609655 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.609753 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609683 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.609753 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609706 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.609753 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609746 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610010 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609772 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610010 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609802 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-web-config\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610010 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609825 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610010 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609844 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610010 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wg99v\" (UniqueName: \"kubernetes.io/projected/f27db00e-d71f-4250-9b77-a1cc2bbd1756-kube-api-access-wg99v\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610010 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609891 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-config\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610010 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609912 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610010 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f27db00e-d71f-4250-9b77-a1cc2bbd1756-config-out\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610010 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609954 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610010 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.609969 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610010 ip-10-0-138-88 
kubenswrapper[2579]: I0416 18:20:44.609991 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f27db00e-d71f-4250-9b77-a1cc2bbd1756-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610508 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.610033 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610677 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.610654 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.610928 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.610908 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.612048 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.611710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.613227 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.612999 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.615635 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.615610 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.615999 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.615978 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f27db00e-d71f-4250-9b77-a1cc2bbd1756-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.617697 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.617648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.618814 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.618233 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.618814 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.618384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.618814 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.618427 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.618814 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.618742 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-web-config\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.618814 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.618745 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.619102 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.618826 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f27db00e-d71f-4250-9b77-a1cc2bbd1756-config-out\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.619490 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.619204 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-config\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.619490 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.619244 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.619490 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.619375 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.619671 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.619593 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.624481 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.624458 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg99v\" (UniqueName: \"kubernetes.io/projected/f27db00e-d71f-4250-9b77-a1cc2bbd1756-kube-api-access-wg99v\") pod \"prometheus-k8s-0\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.701736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.701640 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:44.849820 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:44.849788 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:20:44.852438 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:20:44.852383 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf27db00e_d71f_4250_9b77_a1cc2bbd1756.slice/crio-f2b066614a37da19a7a3916ef3b7533e4b4ab762c1aa442670831e330b1b0956 WatchSource:0}: Error finding container f2b066614a37da19a7a3916ef3b7533e4b4ab762c1aa442670831e330b1b0956: Status 404 returned error can't find the container with id f2b066614a37da19a7a3916ef3b7533e4b4ab762c1aa442670831e330b1b0956 Apr 16 18:20:45.164712 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:45.164686 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-57dcf45987-66g2j" Apr 16 18:20:45.245026 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:45.244987 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-886754884-tkmnd" event={"ID":"929b560f-9d1d-4d0b-be84-3095c605bb4c","Type":"ContainerStarted","Data":"de673a72836f6909892b295d9fe5fdb9fc4153704e06de18295918934886b713"} Apr 16 18:20:45.246293 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:45.246254 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerStarted","Data":"f2b066614a37da19a7a3916ef3b7533e4b4ab762c1aa442670831e330b1b0956"} Apr 16 18:20:45.272134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:45.272078 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-886754884-tkmnd" podStartSLOduration=1.791368985 podStartE2EDuration="3.272057645s" podCreationTimestamp="2026-04-16 18:20:42 +0000 UTC" firstStartedPulling="2026-04-16 18:20:43.050346255 +0000 UTC m=+185.957926261" lastFinishedPulling="2026-04-16 18:20:44.531034912 +0000 UTC m=+187.438614921" observedRunningTime="2026-04-16 18:20:45.271114055 +0000 UTC m=+188.178694085" watchObservedRunningTime="2026-04-16 18:20:45.272057645 +0000 UTC m=+188.179637675" Apr 16 18:20:46.250559 ip-10-0-138-88 kubenswrapper[2579]: I0416 
18:20:46.250508 2579 generic.go:358] "Generic (PLEG): container finished" podID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerID="776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268" exitCode=0 Apr 16 18:20:46.251040 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:46.250591 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerDied","Data":"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268"} Apr 16 18:20:48.484021 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:48.483948 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" podUID="701b7355-0fb2-412e-9af7-09e007fb99bc" containerName="registry" containerID="cri-o://49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767" gracePeriod=30 Apr 16 18:20:48.945680 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:48.945658 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:20:49.071108 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.069687 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-certificates\") pod \"701b7355-0fb2-412e-9af7-09e007fb99bc\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " Apr 16 18:20:49.071108 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.069748 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/701b7355-0fb2-412e-9af7-09e007fb99bc-ca-trust-extracted\") pod \"701b7355-0fb2-412e-9af7-09e007fb99bc\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " Apr 16 18:20:49.071108 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.069788 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/701b7355-0fb2-412e-9af7-09e007fb99bc-image-registry-private-configuration\") pod \"701b7355-0fb2-412e-9af7-09e007fb99bc\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " Apr 16 18:20:49.071108 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.069831 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q85kb\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-kube-api-access-q85kb\") pod \"701b7355-0fb2-412e-9af7-09e007fb99bc\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " Apr 16 18:20:49.071108 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.069865 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls\") pod \"701b7355-0fb2-412e-9af7-09e007fb99bc\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " Apr 16 18:20:49.071108 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.069903 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-bound-sa-token\") pod \"701b7355-0fb2-412e-9af7-09e007fb99bc\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") "
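"Killing container with a grace period ... gracePeriod=30" above is the normal termination handshake: the runtime delivers SIGTERM and escalates to SIGKILL only if the container is still running when the 30-second grace period expires. The registry exits promptly here (its exitCode=0 is reported at 18:20:49.260 below) while the volume reconciler unmounts its volumes. A process only benefits from the grace period if it catches the signal, as in this generic sketch (an illustrative shutdown pattern, not the registry's actual code):

package main

import (
	"fmt"
	"os"
	"os/signal"
	"syscall"
	"time"
)

func main() {
	// Catch the SIGTERM the runtime sends at the start of the grace
	// period (30s for the registry pod above).
	stop := make(chan os.Signal, 1)
	signal.Notify(stop, syscall.SIGTERM)

	<-stop
	fmt.Println("SIGTERM received: draining in-flight work")
	// Finish or abort work well inside the grace period; anything still
	// running when it expires is SIGKILLed with no further warning.
	time.Sleep(2 * time.Second) // stand-in for real cleanup
	fmt.Println("clean exit before the deadline")
}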
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/701b7355-0fb2-412e-9af7-09e007fb99bc-installation-pull-secrets\") pod \"701b7355-0fb2-412e-9af7-09e007fb99bc\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " Apr 16 18:20:49.071108 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.070046 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/701b7355-0fb2-412e-9af7-09e007fb99bc-trusted-ca\") pod \"701b7355-0fb2-412e-9af7-09e007fb99bc\" (UID: \"701b7355-0fb2-412e-9af7-09e007fb99bc\") " Apr 16 18:20:49.071108 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.070714 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701b7355-0fb2-412e-9af7-09e007fb99bc-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "701b7355-0fb2-412e-9af7-09e007fb99bc" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:49.073329 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.073278 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "701b7355-0fb2-412e-9af7-09e007fb99bc" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:49.074554 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.074341 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-kube-api-access-q85kb" (OuterVolumeSpecName: "kube-api-access-q85kb") pod "701b7355-0fb2-412e-9af7-09e007fb99bc" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc"). InnerVolumeSpecName "kube-api-access-q85kb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:49.074956 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.074912 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "701b7355-0fb2-412e-9af7-09e007fb99bc" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:49.075496 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.075455 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "701b7355-0fb2-412e-9af7-09e007fb99bc" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:49.077559 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.077510 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701b7355-0fb2-412e-9af7-09e007fb99bc-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "701b7355-0fb2-412e-9af7-09e007fb99bc" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:20:49.080042 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.080002 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701b7355-0fb2-412e-9af7-09e007fb99bc-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "701b7355-0fb2-412e-9af7-09e007fb99bc" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:20:49.083344 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.083317 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/701b7355-0fb2-412e-9af7-09e007fb99bc-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "701b7355-0fb2-412e-9af7-09e007fb99bc" (UID: "701b7355-0fb2-412e-9af7-09e007fb99bc"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:49.171744 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.171708 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/701b7355-0fb2-412e-9af7-09e007fb99bc-trusted-ca\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:20:49.171744 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.171740 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-certificates\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:20:49.171744 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.171749 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/701b7355-0fb2-412e-9af7-09e007fb99bc-ca-trust-extracted\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:20:49.171940 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.171761 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/701b7355-0fb2-412e-9af7-09e007fb99bc-image-registry-private-configuration\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:20:49.171940 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.171771 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q85kb\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-kube-api-access-q85kb\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:20:49.171940 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.171780 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-registry-tls\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:20:49.171940 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.171789 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/701b7355-0fb2-412e-9af7-09e007fb99bc-bound-sa-token\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:20:49.171940 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.171797 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/701b7355-0fb2-412e-9af7-09e007fb99bc-installation-pull-secrets\") on node 
\"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:20:49.260565 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.260535 2579 generic.go:358] "Generic (PLEG): container finished" podID="701b7355-0fb2-412e-9af7-09e007fb99bc" containerID="49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767" exitCode=0 Apr 16 18:20:49.260746 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.260594 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" Apr 16 18:20:49.260746 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.260618 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" event={"ID":"701b7355-0fb2-412e-9af7-09e007fb99bc","Type":"ContainerDied","Data":"49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767"} Apr 16 18:20:49.260746 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.260654 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84d7cb9877-cnlt9" event={"ID":"701b7355-0fb2-412e-9af7-09e007fb99bc","Type":"ContainerDied","Data":"d9f02e49481fb331c2e010e83545afad27f18690e16d11e19bf69ad6d93bb0af"} Apr 16 18:20:49.260746 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.260673 2579 scope.go:117] "RemoveContainer" containerID="49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767" Apr 16 18:20:49.262778 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.262708 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerStarted","Data":"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8"} Apr 16 18:20:49.262778 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.262741 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerStarted","Data":"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3"} Apr 16 18:20:49.273969 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.273945 2579 scope.go:117] "RemoveContainer" containerID="49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767" Apr 16 18:20:49.274662 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:20:49.274642 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767\": container with ID starting with 49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767 not found: ID does not exist" containerID="49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767" Apr 16 18:20:49.274715 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.274670 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767"} err="failed to get container status \"49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767\": rpc error: code = NotFound desc = could not find container \"49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767\": container with ID starting with 49a947358df6a04739d8ee022660b39aaf7b1db4d0078a958f0c38dfb127b767 not found: ID does not exist" Apr 16 18:20:49.290120 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.290089 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-84d7cb9877-cnlt9"] Apr 16 18:20:49.293969 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.293947 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-84d7cb9877-cnlt9"] Apr 16 18:20:49.669090 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:49.669055 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701b7355-0fb2-412e-9af7-09e007fb99bc" path="/var/lib/kubelet/pods/701b7355-0fb2-412e-9af7-09e007fb99bc/volumes" Apr 16 18:20:51.272746 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:51.272714 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerStarted","Data":"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9"} Apr 16 18:20:51.272746 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:51.272749 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerStarted","Data":"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d"} Apr 16 18:20:51.273179 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:51.272759 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerStarted","Data":"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b"} Apr 16 18:20:51.273179 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:51.272767 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerStarted","Data":"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de"} Apr 16 18:20:51.305572 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:51.305511 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.5792940130000002 podStartE2EDuration="7.305490769s" podCreationTimestamp="2026-04-16 18:20:44 +0000 UTC" firstStartedPulling="2026-04-16 18:20:44.85435616 +0000 UTC m=+187.761936166" lastFinishedPulling="2026-04-16 18:20:50.580552911 +0000 UTC m=+193.488132922" observedRunningTime="2026-04-16 18:20:51.304015131 +0000 UTC m=+194.211595171" watchObservedRunningTime="2026-04-16 18:20:51.305490769 +0000 UTC m=+194.213070803" Apr 16 18:20:54.702381 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:54.702337 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:56.664538 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:56.664507 2579 scope.go:117] "RemoveContainer" containerID="cc215715786f44b2c0bdcb9f4622b015d27472fcf554ab43bcf8260da31cc35e" Apr 16 18:20:57.290154 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:57.290123 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:20:57.290300 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:57.290203 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" event={"ID":"f3230a09-30be-4152-ac36-65d7911245a2","Type":"ContainerStarted","Data":"bd83cb233dadbc6ad0ded6fc0e34963835fe42367bd5a03fb99375079b7bd25e"} Apr 16 18:20:57.290513 
Apr 16 18:20:57.290513 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:57.290487 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7"
Apr 16 18:20:57.309487 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:57.309427 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7" podStartSLOduration=58.662875319 podStartE2EDuration="1m1.309411368s" podCreationTimestamp="2026-04-16 18:19:56 +0000 UTC" firstStartedPulling="2026-04-16 18:19:57.428709923 +0000 UTC m=+140.336289930" lastFinishedPulling="2026-04-16 18:20:00.075245972 +0000 UTC m=+142.982825979" observedRunningTime="2026-04-16 18:20:57.308258813 +0000 UTC m=+200.215838837" watchObservedRunningTime="2026-04-16 18:20:57.309411368 +0000 UTC m=+200.216991391"
Apr 16 18:20:57.682762 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:20:57.682674 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-hjml7"
Apr 16 18:21:02.919725 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:02.919693 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-886754884-tkmnd"
Apr 16 18:21:02.920191 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:02.919781 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-886754884-tkmnd"
Apr 16 18:21:16.343323 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:16.343284 2579 generic.go:358] "Generic (PLEG): container finished" podID="f8f3ed9b-bbc2-445b-9380-064dc07640d5" containerID="221c969e9521c17ce06ff974e4846af995a0d85290779a169d6fd82e7b337e56" exitCode=0
Apr 16 18:21:16.343701 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:16.343359 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" event={"ID":"f8f3ed9b-bbc2-445b-9380-064dc07640d5","Type":"ContainerDied","Data":"221c969e9521c17ce06ff974e4846af995a0d85290779a169d6fd82e7b337e56"}
Apr 16 18:21:16.343742 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:16.343711 2579 scope.go:117] "RemoveContainer" containerID="221c969e9521c17ce06ff974e4846af995a0d85290779a169d6fd82e7b337e56"
Apr 16 18:21:17.348373 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:17.348337 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-6tvbf" event={"ID":"f8f3ed9b-bbc2-445b-9380-064dc07640d5","Type":"ContainerStarted","Data":"e07e68abb4f2bac9738557adf5f5a03e61ee0f49dfa31b649d6e7735111e431f"}
Apr 16 18:21:22.925622 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:22.925588 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-886754884-tkmnd"
Apr 16 18:21:22.929621 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:22.929589 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-886754884-tkmnd"
Apr 16 18:21:44.702499 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:44.702460 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:21:44.721654 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:44.721626 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:21:45.440968 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:45.440943 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:21:49.398518 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:49.398460 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:21:49.400702 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:49.400681 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a84444e3-6cab-4290-a61c-c01132150e31-metrics-certs\") pod \"network-metrics-daemon-h2jgj\" (UID: \"a84444e3-6cab-4290-a61c-c01132150e31\") " pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:21:49.569479 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:49.569446 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k7xtv\""
Apr 16 18:21:49.577356 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:49.577337 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h2jgj"
Apr 16 18:21:49.693936 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:49.693900 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h2jgj"]
Apr 16 18:21:49.697532 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:21:49.697504 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda84444e3_6cab_4290_a61c_c01132150e31.slice/crio-86fdcdcb6b891a3c4ca9754a184f35aa2bfd6c1e445b42328b71fbc506e5e3ef WatchSource:0}: Error finding container 86fdcdcb6b891a3c4ca9754a184f35aa2bfd6c1e445b42328b71fbc506e5e3ef: Status 404 returned error can't find the container with id 86fdcdcb6b891a3c4ca9754a184f35aa2bfd6c1e445b42328b71fbc506e5e3ef
Apr 16 18:21:50.442216 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:50.442163 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h2jgj" event={"ID":"a84444e3-6cab-4290-a61c-c01132150e31","Type":"ContainerStarted","Data":"86fdcdcb6b891a3c4ca9754a184f35aa2bfd6c1e445b42328b71fbc506e5e3ef"}
Apr 16 18:21:51.446811 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:51.446776 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h2jgj" event={"ID":"a84444e3-6cab-4290-a61c-c01132150e31","Type":"ContainerStarted","Data":"066a74bb70ce00bffab71980c5b8bd21dc958179258c3a46e97bd6978542fe78"}
Apr 16 18:21:51.446811 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:51.446812 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h2jgj" event={"ID":"a84444e3-6cab-4290-a61c-c01132150e31","Type":"ContainerStarted","Data":"d266a7df953f975a8d43b54824cff05ad3dd715f6b96cd900c27296ce4ead00a"}
Apr 16 18:21:51.464479 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:21:51.464434 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-h2jgj" podStartSLOduration=253.605049698 podStartE2EDuration="4m14.46442182s" podCreationTimestamp="2026-04-16 18:17:37 +0000 UTC" firstStartedPulling="2026-04-16 18:21:49.699367808 +0000 UTC m=+252.606947819" lastFinishedPulling="2026-04-16 18:21:50.558739935 +0000 UTC m=+253.466319941" observedRunningTime="2026-04-16 18:21:51.463436599 +0000 UTC m=+254.371016629" watchObservedRunningTime="2026-04-16 18:21:51.46442182 +0000 UTC m=+254.372001848"
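[Editor's note] The pod_startup_latency_tracker entries encode a checkable identity: the m=+ suffixes are Go monotonic-clock offsets since kubelet start, and podStartSLOduration is the end-to-end duration minus the image-pull window (lastFinishedPulling - firstStartedPulling). For the network-metrics-daemon-h2jgj entry above: 254.46442182 - (253.466319941 - 252.606947819) = 253.605049698. A quick check, with values copied from that entry:

    import re

    # Fields copied verbatim from the network-metrics-daemon-h2jgj entry above.
    LINE = ('podStartSLOduration=253.605049698 podStartE2EDuration="4m14.46442182s" '
            'firstStartedPulling="2026-04-16 18:21:49.699367808 +0000 UTC m=+252.606947819" '
            'lastFinishedPulling="2026-04-16 18:21:50.558739935 +0000 UTC m=+253.466319941"')

    def mono(field):
        # m=+N.NNN is Go's monotonic-clock reading, in seconds since kubelet start
        return float(re.search(field + r'="[^"]*m=\+([0-9.]+)"', LINE).group(1))

    def secs(dur):
        # parse a Go duration of the form "4m14.46442182s"
        m = re.fullmatch(r'(?:(\d+)m)?([\d.]+)s', dur)
        return int(m.group(1) or 0) * 60 + float(m.group(2))

    e2e = secs(re.search(r'podStartE2EDuration="([^"]+)"', LINE).group(1))   # 254.46442182
    pull = mono("lastFinishedPulling") - mono("firstStartedPulling")         # 0.859372122
    slo = float(re.search(r'podStartSLOduration=([\d.]+)', LINE).group(1))   # 253.605049698
    assert abs((e2e - pull) - slo) < 1e-6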
UTC m=+252.606947819" lastFinishedPulling="2026-04-16 18:21:50.558739935 +0000 UTC m=+253.466319941" observedRunningTime="2026-04-16 18:21:51.463436599 +0000 UTC m=+254.371016629" watchObservedRunningTime="2026-04-16 18:21:51.46442182 +0000 UTC m=+254.372001848" Apr 16 18:22:02.801260 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:02.801173 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:22:02.802629 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:02.802476 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="prometheus" containerID="cri-o://c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3" gracePeriod=600 Apr 16 18:22:02.803080 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:02.802783 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="kube-rbac-proxy-thanos" containerID="cri-o://89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9" gracePeriod=600 Apr 16 18:22:02.803301 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:02.802825 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="kube-rbac-proxy-web" containerID="cri-o://9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b" gracePeriod=600 Apr 16 18:22:02.803413 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:02.802842 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="kube-rbac-proxy" containerID="cri-o://ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d" gracePeriod=600 Apr 16 18:22:02.803800 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:02.802863 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="config-reloader" containerID="cri-o://613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8" gracePeriod=600 Apr 16 18:22:02.803800 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:02.802896 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="thanos-sidecar" containerID="cri-o://50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de" gracePeriod=600 Apr 16 18:22:03.060127 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.060058 2579 util.go:48] "No ready sandbox for pod can be found. 
Apr 16 18:22:03.106202 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106165 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-k8s-rulefiles-0\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106381 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106212 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-tls\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106381 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106239 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106381 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106271 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-web-config\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106381 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106308 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-k8s-db\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106381 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106332 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-metrics-client-certs\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106381 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106371 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wg99v\" (UniqueName: \"kubernetes.io/projected/f27db00e-d71f-4250-9b77-a1cc2bbd1756-kube-api-access-wg99v\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106413 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-config\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106438 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f27db00e-d71f-4250-9b77-a1cc2bbd1756-config-out\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106465 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-metrics-client-ca\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106497 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-kube-rbac-proxy\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106524 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f27db00e-d71f-4250-9b77-a1cc2bbd1756-tls-assets\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106568 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-grpc-tls\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106610 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-trusted-ca-bundle\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106663 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-serving-certs-ca-bundle\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106688 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.106736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106729 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-thanos-prometheus-http-client-file\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.107200 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.106762 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-kubelet-serving-ca-bundle\") pod \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\" (UID: \"f27db00e-d71f-4250-9b77-a1cc2bbd1756\") "
Apr 16 18:22:03.107328 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.107267 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:22:03.107422 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.107375 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:22:03.107860 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.107521 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:22:03.107860 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.107787 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:22:03.108534 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.108505 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:22:03.109477 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.109449 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:22:03.110269 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.110239 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f27db00e-d71f-4250-9b77-a1cc2bbd1756-config-out" (OuterVolumeSpecName: "config-out") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:22:03.110377 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.110330 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:22:03.110816 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.110776 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:22:03.110917 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.110851 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:22:03.111589 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.111559 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:22:03.112007 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.111981 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27db00e-d71f-4250-9b77-a1cc2bbd1756-kube-api-access-wg99v" (OuterVolumeSpecName: "kube-api-access-wg99v") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "kube-api-access-wg99v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:22:03.112180 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.112155 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:22:03.112269 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.112239 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:22:03.112475 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.112447 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:22:03.113551 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.113516 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-config" (OuterVolumeSpecName: "config") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:22:03.113775 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.113741 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27db00e-d71f-4250-9b77-a1cc2bbd1756-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:22:03.125631 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.125593 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-web-config" (OuterVolumeSpecName: "web-config") pod "f27db00e-d71f-4250-9b77-a1cc2bbd1756" (UID: "f27db00e-d71f-4250-9b77-a1cc2bbd1756"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:22:03.208362 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208321 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208362 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208357 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208362 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208369 2579 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-web-config\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208378 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-k8s-db\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208387 2579 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-metrics-client-certs\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208423 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wg99v\" (UniqueName: \"kubernetes.io/projected/f27db00e-d71f-4250-9b77-a1cc2bbd1756-kube-api-access-wg99v\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208432 2579 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-config\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208440 2579 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f27db00e-d71f-4250-9b77-a1cc2bbd1756-config-out\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208450 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-metrics-client-ca\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208458 2579 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-kube-rbac-proxy\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208466 2579 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f27db00e-d71f-4250-9b77-a1cc2bbd1756-tls-assets\") on node \"ip-10-0-138-88.ec2.internal\" 
DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208474 2579 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-grpc-tls\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208482 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208492 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208501 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208510 2579 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f27db00e-d71f-4250-9b77-a1cc2bbd1756-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208519 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.208642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.208527 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f27db00e-d71f-4250-9b77-a1cc2bbd1756-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:22:03.490055 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490025 2579 generic.go:358] "Generic (PLEG): container finished" podID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerID="89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9" exitCode=0 Apr 16 18:22:03.490055 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490051 2579 generic.go:358] "Generic (PLEG): container finished" podID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerID="ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d" exitCode=0 Apr 16 18:22:03.490055 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490057 2579 generic.go:358] "Generic (PLEG): container finished" podID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerID="9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b" exitCode=0 Apr 16 18:22:03.490258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490062 2579 generic.go:358] "Generic (PLEG): container finished" podID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerID="50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de" exitCode=0 Apr 16 18:22:03.490258 ip-10-0-138-88 
Apr 16 18:22:03.490258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490068 2579 generic.go:358] "Generic (PLEG): container finished" podID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerID="613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8" exitCode=0
Apr 16 18:22:03.490258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490072 2579 generic.go:358] "Generic (PLEG): container finished" podID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerID="c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3" exitCode=0
Apr 16 18:22:03.490258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490107 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerDied","Data":"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9"}
Apr 16 18:22:03.490258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490145 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerDied","Data":"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d"}
Apr 16 18:22:03.490258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490147 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:22:03.490258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490157 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerDied","Data":"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b"}
Apr 16 18:22:03.490258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490167 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerDied","Data":"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de"}
Apr 16 18:22:03.490258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490176 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerDied","Data":"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8"}
Apr 16 18:22:03.490258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490184 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerDied","Data":"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3"}
Apr 16 18:22:03.490258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490193 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f27db00e-d71f-4250-9b77-a1cc2bbd1756","Type":"ContainerDied","Data":"f2b066614a37da19a7a3916ef3b7533e4b4ab762c1aa442670831e330b1b0956"}
Apr 16 18:22:03.490258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.490212 2579 scope.go:117] "RemoveContainer" containerID="89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9"
Apr 16 18:22:03.498079 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.498060 2579 scope.go:117] "RemoveContainer" containerID="ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d"
Apr 16 18:22:03.504905 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.504879 2579 scope.go:117] "RemoveContainer" containerID="9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b"
Apr 16 18:22:03.511150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.511133 2579 scope.go:117] "RemoveContainer" containerID="50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de"
Apr 16 18:22:03.514626 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.514605 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:22:03.518835 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.518803 2579 scope.go:117] "RemoveContainer" containerID="613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8"
Apr 16 18:22:03.519720 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.519688 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:22:03.525149 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.525132 2579 scope.go:117] "RemoveContainer" containerID="c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3"
Apr 16 18:22:03.531667 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.531651 2579 scope.go:117] "RemoveContainer" containerID="776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268"
Apr 16 18:22:03.538194 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.538174 2579 scope.go:117] "RemoveContainer" containerID="89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9"
Apr 16 18:22:03.538483 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:22:03.538463 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": container with ID starting with 89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9 not found: ID does not exist" containerID="89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9"
Apr 16 18:22:03.538556 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.538496 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9"} err="failed to get container status \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": rpc error: code = NotFound desc = could not find container \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": container with ID starting with 89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9 not found: ID does not exist"
Apr 16 18:22:03.538556 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.538524 2579 scope.go:117] "RemoveContainer" containerID="ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d"
Apr 16 18:22:03.538777 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:22:03.538760 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d\": container with ID starting with ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d not found: ID does not exist" containerID="ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d"
Apr 16 18:22:03.538818 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.538784 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d"} err="failed to get container status \"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d\": rpc error: code = NotFound desc = could not find container \"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d\": container with ID starting with ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d not found: ID does not exist"
Apr 16 18:22:03.538818 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.538800 2579 scope.go:117] "RemoveContainer" containerID="9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b"
Apr 16 18:22:03.539017 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:22:03.539000 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b\": container with ID starting with 9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b not found: ID does not exist" containerID="9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b"
Apr 16 18:22:03.539073 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.539026 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b"} err="failed to get container status \"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b\": rpc error: code = NotFound desc = could not find container \"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b\": container with ID starting with 9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b not found: ID does not exist"
Apr 16 18:22:03.539073 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.539046 2579 scope.go:117] "RemoveContainer" containerID="50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de"
Apr 16 18:22:03.539278 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:22:03.539261 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de\": container with ID starting with 50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de not found: ID does not exist" containerID="50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de"
Apr 16 18:22:03.539315 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.539283 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de"} err="failed to get container status \"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de\": rpc error: code = NotFound desc = could not find container \"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de\": container with ID starting with 50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de not found: ID does not exist"
Apr 16 18:22:03.539315 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.539296 2579 scope.go:117] "RemoveContainer" containerID="613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8"
Apr 16 18:22:03.539528 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:22:03.539511 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8\": container with ID starting with 613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8 not found: ID does not exist" containerID="613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8"
Apr 16 18:22:03.539573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.539533 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8"} err="failed to get container status \"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8\": rpc error: code = NotFound desc = could not find container \"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8\": container with ID starting with 613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8 not found: ID does not exist"
Apr 16 18:22:03.539573 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.539547 2579 scope.go:117] "RemoveContainer" containerID="c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3"
Apr 16 18:22:03.539763 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:22:03.539748 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3\": container with ID starting with c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3 not found: ID does not exist" containerID="c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3"
Apr 16 18:22:03.539824 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.539771 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3"} err="failed to get container status \"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3\": rpc error: code = NotFound desc = could not find container \"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3\": container with ID starting with c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3 not found: ID does not exist"
Apr 16 18:22:03.539824 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.539790 2579 scope.go:117] "RemoveContainer" containerID="776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268"
Apr 16 18:22:03.540008 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:22:03.539991 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268\": container with ID starting with 776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268 not found: ID does not exist" containerID="776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268"
Apr 16 18:22:03.540052 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.540012 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268"} err="failed to get container status \"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268\": rpc error: code = NotFound desc = could not find container \"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268\": container with ID starting with 776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268 not found: ID does not exist"
Apr 16 18:22:03.540052 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.540026 2579 scope.go:117] "RemoveContainer" containerID="89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9"
Apr 16 18:22:03.540250 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.540234 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9"} err="failed to get container status \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": rpc error: code = NotFound desc = could not find container \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": container with ID starting with 89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9 not found: ID does not exist"
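[Editor's note] From here to the end of the capture, each "RemoveContainer" is answered by a NotFound from the runtime and a "DeleteContainer returned error": the containers were already gone when the kubelet retried deletion, so the repetition appears to be a benign race with runtime-side cleanup rather than a real failure. A sketch that tallies the repeated failed deletions per container ID; notfound_churn is a hypothetical helper:

    import re
    from collections import Counter

    DEL_ERR = re.compile(r'"DeleteContainer returned error" '
                         r'containerID=\{"Type":"cri-o","ID":"(?P<cid>[0-9a-f]{64})"\}')

    def notfound_churn(lines):
        """Count failed deletions per container ID (abbreviated to 12 hex chars)."""
        counts = Counter()
        for line in lines:
            m = DEL_ERR.search(line)
            if m:
                counts[m.group("cid")[:12]] += 1
        return counts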
containerID={"Type":"cri-o","ID":"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9"} err="failed to get container status \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": rpc error: code = NotFound desc = could not find container \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": container with ID starting with 89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9 not found: ID does not exist" Apr 16 18:22:03.540250 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.540250 2579 scope.go:117] "RemoveContainer" containerID="ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d" Apr 16 18:22:03.540457 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.540437 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d"} err="failed to get container status \"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d\": rpc error: code = NotFound desc = could not find container \"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d\": container with ID starting with ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d not found: ID does not exist" Apr 16 18:22:03.540501 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.540461 2579 scope.go:117] "RemoveContainer" containerID="9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b" Apr 16 18:22:03.540675 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.540654 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b"} err="failed to get container status \"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b\": rpc error: code = NotFound desc = could not find container \"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b\": container with ID starting with 9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b not found: ID does not exist" Apr 16 18:22:03.540719 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.540684 2579 scope.go:117] "RemoveContainer" containerID="50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de" Apr 16 18:22:03.540874 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.540856 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de"} err="failed to get container status \"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de\": rpc error: code = NotFound desc = could not find container \"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de\": container with ID starting with 50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de not found: ID does not exist" Apr 16 18:22:03.540926 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.540876 2579 scope.go:117] "RemoveContainer" containerID="613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8" Apr 16 18:22:03.541067 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.541049 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8"} err="failed to get container status \"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8\": rpc error: code = NotFound desc = could not find container 
\"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8\": container with ID starting with 613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8 not found: ID does not exist" Apr 16 18:22:03.541067 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.541064 2579 scope.go:117] "RemoveContainer" containerID="c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3" Apr 16 18:22:03.541290 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.541273 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3"} err="failed to get container status \"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3\": rpc error: code = NotFound desc = could not find container \"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3\": container with ID starting with c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3 not found: ID does not exist" Apr 16 18:22:03.541338 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.541290 2579 scope.go:117] "RemoveContainer" containerID="776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268" Apr 16 18:22:03.541585 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.541560 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268"} err="failed to get container status \"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268\": rpc error: code = NotFound desc = could not find container \"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268\": container with ID starting with 776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268 not found: ID does not exist" Apr 16 18:22:03.541585 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.541579 2579 scope.go:117] "RemoveContainer" containerID="89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9" Apr 16 18:22:03.541818 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.541799 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9"} err="failed to get container status \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": rpc error: code = NotFound desc = could not find container \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": container with ID starting with 89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9 not found: ID does not exist" Apr 16 18:22:03.541889 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.541819 2579 scope.go:117] "RemoveContainer" containerID="ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d" Apr 16 18:22:03.542042 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.542024 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d"} err="failed to get container status \"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d\": rpc error: code = NotFound desc = could not find container \"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d\": container with ID starting with ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d not found: ID does not exist" Apr 16 18:22:03.542088 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.542043 2579 scope.go:117] "RemoveContainer" 
containerID="9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b" Apr 16 18:22:03.542238 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.542221 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b"} err="failed to get container status \"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b\": rpc error: code = NotFound desc = could not find container \"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b\": container with ID starting with 9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b not found: ID does not exist" Apr 16 18:22:03.542309 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.542240 2579 scope.go:117] "RemoveContainer" containerID="50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de" Apr 16 18:22:03.542456 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.542440 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de"} err="failed to get container status \"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de\": rpc error: code = NotFound desc = could not find container \"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de\": container with ID starting with 50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de not found: ID does not exist" Apr 16 18:22:03.542507 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.542457 2579 scope.go:117] "RemoveContainer" containerID="613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8" Apr 16 18:22:03.542632 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.542614 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8"} err="failed to get container status \"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8\": rpc error: code = NotFound desc = could not find container \"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8\": container with ID starting with 613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8 not found: ID does not exist" Apr 16 18:22:03.542696 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.542633 2579 scope.go:117] "RemoveContainer" containerID="c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3" Apr 16 18:22:03.542827 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.542807 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3"} err="failed to get container status \"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3\": rpc error: code = NotFound desc = could not find container \"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3\": container with ID starting with c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3 not found: ID does not exist" Apr 16 18:22:03.542883 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.542830 2579 scope.go:117] "RemoveContainer" containerID="776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268" Apr 16 18:22:03.543017 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.543000 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268"} err="failed to get container status \"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268\": rpc error: code = NotFound desc = could not find container \"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268\": container with ID starting with 776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268 not found: ID does not exist" Apr 16 18:22:03.543065 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.543017 2579 scope.go:117] "RemoveContainer" containerID="89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9" Apr 16 18:22:03.543229 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.543211 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9"} err="failed to get container status \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": rpc error: code = NotFound desc = could not find container \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": container with ID starting with 89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9 not found: ID does not exist" Apr 16 18:22:03.543289 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.543239 2579 scope.go:117] "RemoveContainer" containerID="ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d" Apr 16 18:22:03.543459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.543438 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d"} err="failed to get container status \"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d\": rpc error: code = NotFound desc = could not find container \"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d\": container with ID starting with ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d not found: ID does not exist" Apr 16 18:22:03.543530 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.543460 2579 scope.go:117] "RemoveContainer" containerID="9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b" Apr 16 18:22:03.543672 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.543656 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b"} err="failed to get container status \"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b\": rpc error: code = NotFound desc = could not find container \"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b\": container with ID starting with 9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b not found: ID does not exist" Apr 16 18:22:03.543734 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.543673 2579 scope.go:117] "RemoveContainer" containerID="50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de" Apr 16 18:22:03.543886 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.543860 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de"} err="failed to get container status \"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de\": rpc error: code = NotFound desc = could not find container 
\"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de\": container with ID starting with 50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de not found: ID does not exist" Apr 16 18:22:03.543886 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.543878 2579 scope.go:117] "RemoveContainer" containerID="613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8" Apr 16 18:22:03.544097 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.544080 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8"} err="failed to get container status \"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8\": rpc error: code = NotFound desc = could not find container \"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8\": container with ID starting with 613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8 not found: ID does not exist" Apr 16 18:22:03.544147 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.544098 2579 scope.go:117] "RemoveContainer" containerID="c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3" Apr 16 18:22:03.544444 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.544413 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3"} err="failed to get container status \"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3\": rpc error: code = NotFound desc = could not find container \"c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3\": container with ID starting with c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3 not found: ID does not exist" Apr 16 18:22:03.544550 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.544441 2579 scope.go:117] "RemoveContainer" containerID="776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268" Apr 16 18:22:03.544854 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.544827 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268"} err="failed to get container status \"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268\": rpc error: code = NotFound desc = could not find container \"776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268\": container with ID starting with 776e8c6d37eda31d2c08e5ff0b582d5b1eeb31fb9285c86838deeb5d55d2a268 not found: ID does not exist" Apr 16 18:22:03.544976 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.544964 2579 scope.go:117] "RemoveContainer" containerID="89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9" Apr 16 18:22:03.545438 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.545408 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9"} err="failed to get container status \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": rpc error: code = NotFound desc = could not find container \"89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9\": container with ID starting with 89b386879c14a530c5fd58032891491bd95d704602b659e0d3f807bc646d2da9 not found: ID does not exist" Apr 16 18:22:03.545438 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.545438 2579 scope.go:117] "RemoveContainer" 
containerID="ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d" Apr 16 18:22:03.545717 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.545692 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d"} err="failed to get container status \"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d\": rpc error: code = NotFound desc = could not find container \"ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d\": container with ID starting with ffcd4596021c788edb8ab97f3452e061e8fe7beba64519a92f4b2eb7c721897d not found: ID does not exist" Apr 16 18:22:03.545794 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.545719 2579 scope.go:117] "RemoveContainer" containerID="9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b" Apr 16 18:22:03.545966 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.545946 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b"} err="failed to get container status \"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b\": rpc error: code = NotFound desc = could not find container \"9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b\": container with ID starting with 9d1bd38f78571e5acc70a49a3b10c038d7ca918180817d4832857c6b9c21707b not found: ID does not exist" Apr 16 18:22:03.546017 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.545969 2579 scope.go:117] "RemoveContainer" containerID="50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de" Apr 16 18:22:03.546223 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.546203 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de"} err="failed to get container status \"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de\": rpc error: code = NotFound desc = could not find container \"50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de\": container with ID starting with 50dc4ca490c283d91b586b542210083e558b36cb745eeee04af3505b411832de not found: ID does not exist" Apr 16 18:22:03.546305 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.546226 2579 scope.go:117] "RemoveContainer" containerID="613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8" Apr 16 18:22:03.546502 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.546478 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8"} err="failed to get container status \"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8\": rpc error: code = NotFound desc = could not find container \"613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8\": container with ID starting with 613c653337a7e4e2b0fa73e95a6a0a84acffd7d08ac341e0c5a324019ab979b8 not found: ID does not exist" Apr 16 18:22:03.546587 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.546502 2579 scope.go:117] "RemoveContainer" containerID="c14cba1de454f9944a413af592ad234e0751bef449d376356e0df3b9ea553bf3" Apr 16 18:22:03.546587 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.546548 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:22:03.546765 ip-10-0-138-88 kubenswrapper[2579]: I0416 
Apr 16 18:22:03.546957 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.546940 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="kube-rbac-proxy-web"
Apr 16 18:22:03.547007 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.546960 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="kube-rbac-proxy-web"
Apr 16 18:22:03.547007 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.546976 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="init-config-reloader"
Apr 16 18:22:03.547007 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.546984 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="init-config-reloader"
Apr 16 18:22:03.547007 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.546995 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="701b7355-0fb2-412e-9af7-09e007fb99bc" containerName="registry"
Apr 16 18:22:03.547007 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547003 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="701b7355-0fb2-412e-9af7-09e007fb99bc" containerName="registry"
Apr 16 18:22:03.547150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547012 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="thanos-sidecar"
Apr 16 18:22:03.547150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547019 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="thanos-sidecar"
Apr 16 18:22:03.547150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547028 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="prometheus"
Apr 16 18:22:03.547150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547033 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="prometheus"
Apr 16 18:22:03.547150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547089 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="config-reloader"
Apr 16 18:22:03.547150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547099 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="config-reloader"
Apr 16 18:22:03.547150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547112 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="kube-rbac-proxy-thanos"
Apr 16 18:22:03.547150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547121 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="kube-rbac-proxy-thanos"
Apr 16 18:22:03.547150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547140 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="kube-rbac-proxy"
Apr 16 18:22:03.547150 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547146 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="kube-rbac-proxy"
Apr 16 18:22:03.547641 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547214 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="701b7355-0fb2-412e-9af7-09e007fb99bc" containerName="registry"
Apr 16 18:22:03.547641 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547223 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="config-reloader"
Apr 16 18:22:03.547641 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547231 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="kube-rbac-proxy-thanos"
Apr 16 18:22:03.547641 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547237 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="prometheus"
Apr 16 18:22:03.547641 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547245 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="thanos-sidecar"
Apr 16 18:22:03.547641 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547254 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="kube-rbac-proxy-web"
Apr 16 18:22:03.547641 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.547263 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" containerName="kube-rbac-proxy"
Apr 16 18:22:03.553025 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.553007 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:22:03.555670 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.555498 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 18:22:03.555670 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.555508 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 18:22:03.555670 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.555509 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5oq1hdcp6ot0i\""
Apr 16 18:22:03.555670 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.555556 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 18:22:03.555670 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.555508 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 18:22:03.555670 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.555509 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 18:22:03.555670 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.555598 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 18:22:03.555670 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.555509 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:22:03.555670 ip-10-0-138-88 
kubenswrapper[2579]: I0416 18:22:03.555560 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:22:03.555670 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.555559 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:22:03.556156 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.555996 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:22:03.556156 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.555996 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-sqd8w\"" Apr 16 18:22:03.558604 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.558587 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:22:03.561641 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.561624 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:22:03.566824 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.566803 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:22:03.611579 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611542 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.611579 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611576 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.611790 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.611790 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611662 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.611790 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611694 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" 
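From this point the log shows the new prometheus-k8s-0 pod being assembled: the reconciler first verifies each volume is attached (the VerifyControllerAttachedVolume records around this point), then starts every mount, and operation_generator acknowledges each one with "MountVolume.SetUp succeeded" a few records further on. Pairing the two record types per volume is a quick way to spot a mount that never completes. A minimal sketch under the same placeholder-path assumption; the doubled backslashes in the patterns match the escaped quotes exactly as they appear in this journal text:

import re

# Volume names sit between escaped quotes in the journal text,
# e.g. ... MountVolume started for volume \"tls-assets\" ...
STARTED = re.compile(r'operationExecutor\.MountVolume started for volume \\"([^"\\]+)\\"')
SUCCEEDED = re.compile(r'MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\"')

with open("kubelet-journal.txt", encoding="utf-8") as fh:  # placeholder path
    text = fh.read()

started = set(STARTED.findall(text))
succeeded = set(SUCCEEDED.findall(text))

print(f"{len(succeeded & started)}/{len(started)} started mounts completed")
for volume in sorted(started - succeeded):
    print("no MountVolume.SetUp success logged for:", volume)

In this excerpt all eighteen volumes report success within roughly ten milliseconds of the mounts starting, so the difference set comes back empty.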
Apr 16 18:22:03.611790 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611738 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.611790 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611765 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-config-out\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.611958 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611794 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.611958 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611815 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.611958 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611842 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.611958 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611862 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.611958 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611915 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.611958 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611946 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzhxx\" (UniqueName: \"kubernetes.io/projected/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-kube-api-access-pzhxx\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.612169 ip-10-0-138-88 
kubenswrapper[2579]: I0416 18:22:03.611970 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.612169 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.611988 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.612169 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.612030 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-config\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.612169 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.612054 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-web-config\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.612169 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.612069 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.668484 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.668448 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f27db00e-d71f-4250-9b77-a1cc2bbd1756" path="/var/lib/kubelet/pods/f27db00e-d71f-4250-9b77-a1cc2bbd1756/volumes" Apr 16 18:22:03.713181 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713150 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713327 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713183 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713327 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713327 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713223 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-config-out\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713327 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713249 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713624 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713517 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713624 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713587 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713730 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713624 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713730 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713686 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713730 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713719 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzhxx\" (UniqueName: \"kubernetes.io/projected/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-kube-api-access-pzhxx\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713890 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713746 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713890 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713775 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713890 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-config\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713890 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713863 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-web-config\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.713890 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713888 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.714119 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713935 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.714119 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713959 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.714119 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.713998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.716215 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.714883 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.716215 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.715010 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
16 18:22:03.716215 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.715564 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.716215 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.715614 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.716215 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.715886 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.716574 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.716318 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-config-out\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.716832 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.716768 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.717346 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.717305 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.717825 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.717802 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.717921 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.717905 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.717994 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.717970 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.718358 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.718335 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.718616 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.718598 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-web-config\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.718694 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.718643 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-config\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.718778 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.718758 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.719435 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.719417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.720111 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.720095 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.723626 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.723603 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzhxx\" (UniqueName: \"kubernetes.io/projected/77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce-kube-api-access-pzhxx\") pod \"prometheus-k8s-0\" (UID: \"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:03.863439 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:03.863375 2579 util.go:30] "No sandbox for pod can be found. 
Apr 16 18:22:04.003095 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:04.003003 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:22:04.005797 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:22:04.005757 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77eb8613_8af8_4e1e_89ff_97a2d8a2b3ce.slice/crio-52dd88f5eec62e9958875f78b2005f57bdafc43926bca36622cba9e28267e919 WatchSource:0}: Error finding container 52dd88f5eec62e9958875f78b2005f57bdafc43926bca36622cba9e28267e919: Status 404 returned error can't find the container with id 52dd88f5eec62e9958875f78b2005f57bdafc43926bca36622cba9e28267e919
Apr 16 18:22:04.494324 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:04.494290 2579 generic.go:358] "Generic (PLEG): container finished" podID="77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce" containerID="3bca2a2a830a94e4dc1f95c91759f9475b80b03a84eabaee813ef3f0b8ab798b" exitCode=0
Apr 16 18:22:04.494522 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:04.494382 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce","Type":"ContainerDied","Data":"3bca2a2a830a94e4dc1f95c91759f9475b80b03a84eabaee813ef3f0b8ab798b"}
Apr 16 18:22:04.494522 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:04.494436 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce","Type":"ContainerStarted","Data":"52dd88f5eec62e9958875f78b2005f57bdafc43926bca36622cba9e28267e919"}
Apr 16 18:22:05.500790 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:05.500754 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce","Type":"ContainerStarted","Data":"6637a369ccf22207a08071fc7a4fd46a64abe40227dd3af580231fb6c4d1cb71"}
Apr 16 18:22:05.500790 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:05.500789 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce","Type":"ContainerStarted","Data":"5dc158cbea20f79a76d2574700dc3db069aa88718c14dabaa172bf251a046e8c"}
Apr 16 18:22:05.500790 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:05.500800 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce","Type":"ContainerStarted","Data":"7e0c03ec4887ab38d1a4ee96eb9969356bcc24b7b12f36571f13df08f20be5e3"}
Apr 16 18:22:05.501501 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:05.500809 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce","Type":"ContainerStarted","Data":"899abc0d1d3e5e23ec85e7b9c65529f0076e8ec537cdfb6d3986eb4d53ff9c90"}
Apr 16 18:22:05.501501 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:05.500820 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce","Type":"ContainerStarted","Data":"92bfab1545fe6f03421df110b4578721ebc441d3db512aea225872311af9e3ef"}
Apr 16 18:22:05.501501 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:05.500830 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce","Type":"ContainerStarted","Data":"41a569a5d87250ea71efdd12862a61d72c58166fa4e131c377a2383c6416a6a5"}
Apr 16 18:22:05.529787 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:05.529734 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.529716401 podStartE2EDuration="2.529716401s" podCreationTimestamp="2026-04-16 18:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:22:05.528669622 +0000 UTC m=+268.436249662" watchObservedRunningTime="2026-04-16 18:22:05.529716401 +0000 UTC m=+268.437296433"
Apr 16 18:22:08.864568 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:08.864527 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:22:37.537296 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:37.537270 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log"
Apr 16 18:22:37.537866 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:37.537547 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log"
Apr 16 18:22:37.554891 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:22:37.554838 2579 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 18:23:03.863776 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:23:03.863735 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:23:03.878781 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:23:03.878741 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:23:04.683532 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:23:04.683503 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:25:37.034866 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.034827 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-mwlxs"]
Apr 16 18:25:37.038206 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.038190 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-mwlxs"
Apr 16 18:25:37.040365 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.040343 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 16 18:25:37.040572 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.040558 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 18:25:37.041232 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.041217 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-kmh8h\""
Apr 16 18:25:37.041232 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.041224 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 18:25:37.047211 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.047187 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-mwlxs"]
Apr 16 18:25:37.161811 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.161778 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25d125b-34f0-4d26-ac87-3da371639f2d-cert\") pod \"odh-model-controller-696fc77849-mwlxs\" (UID: \"a25d125b-34f0-4d26-ac87-3da371639f2d\") " pod="kserve/odh-model-controller-696fc77849-mwlxs"
Apr 16 18:25:37.161959 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.161849 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2p2r\" (UniqueName: \"kubernetes.io/projected/a25d125b-34f0-4d26-ac87-3da371639f2d-kube-api-access-s2p2r\") pod \"odh-model-controller-696fc77849-mwlxs\" (UID: \"a25d125b-34f0-4d26-ac87-3da371639f2d\") " pod="kserve/odh-model-controller-696fc77849-mwlxs"
Apr 16 18:25:37.262489 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.262455 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2p2r\" (UniqueName: \"kubernetes.io/projected/a25d125b-34f0-4d26-ac87-3da371639f2d-kube-api-access-s2p2r\") pod \"odh-model-controller-696fc77849-mwlxs\" (UID: \"a25d125b-34f0-4d26-ac87-3da371639f2d\") " pod="kserve/odh-model-controller-696fc77849-mwlxs"
Apr 16 18:25:37.262620 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.262511 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25d125b-34f0-4d26-ac87-3da371639f2d-cert\") pod \"odh-model-controller-696fc77849-mwlxs\" (UID: \"a25d125b-34f0-4d26-ac87-3da371639f2d\") " pod="kserve/odh-model-controller-696fc77849-mwlxs"
Apr 16 18:25:37.264772 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.264753 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25d125b-34f0-4d26-ac87-3da371639f2d-cert\") pod \"odh-model-controller-696fc77849-mwlxs\" (UID: \"a25d125b-34f0-4d26-ac87-3da371639f2d\") " pod="kserve/odh-model-controller-696fc77849-mwlxs"
Apr 16 18:25:37.271040 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.271018 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2p2r\" (UniqueName: \"kubernetes.io/projected/a25d125b-34f0-4d26-ac87-3da371639f2d-kube-api-access-s2p2r\") pod \"odh-model-controller-696fc77849-mwlxs\" (UID: \"a25d125b-34f0-4d26-ac87-3da371639f2d\") " pod="kserve/odh-model-controller-696fc77849-mwlxs"
Apr 16 18:25:37.349011 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.348930 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-mwlxs"
Apr 16 18:25:37.670305 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.670235 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-mwlxs"]
Apr 16 18:25:37.673286 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:25:37.673259 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda25d125b_34f0_4d26_ac87_3da371639f2d.slice/crio-c46fb1940c4b2d098b72b3e17d0973e37b9a5bf6fd4c558eb5afbe19574df57e WatchSource:0}: Error finding container c46fb1940c4b2d098b72b3e17d0973e37b9a5bf6fd4c558eb5afbe19574df57e: Status 404 returned error can't find the container with id c46fb1940c4b2d098b72b3e17d0973e37b9a5bf6fd4c558eb5afbe19574df57e
Apr 16 18:25:37.674429 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:37.674411 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:25:38.079325 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:38.079291 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-mwlxs" event={"ID":"a25d125b-34f0-4d26-ac87-3da371639f2d","Type":"ContainerStarted","Data":"c46fb1940c4b2d098b72b3e17d0973e37b9a5bf6fd4c558eb5afbe19574df57e"}
Apr 16 18:25:40.088106 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:40.088071 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-mwlxs" event={"ID":"a25d125b-34f0-4d26-ac87-3da371639f2d","Type":"ContainerStarted","Data":"ec20531b70310114eabf84e887c85f50113350f1243f02135101e4cdb0bdbbb5"}
Apr 16 18:25:40.088505 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:40.088130 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-mwlxs"
Apr 16 18:25:40.105737 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:40.105684 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-mwlxs" podStartSLOduration=0.786209054 podStartE2EDuration="3.105667801s" podCreationTimestamp="2026-04-16 18:25:37 +0000 UTC" firstStartedPulling="2026-04-16 18:25:37.674559443 +0000 UTC m=+480.582139453" lastFinishedPulling="2026-04-16 18:25:39.994018194 +0000 UTC m=+482.901598200" observedRunningTime="2026-04-16 18:25:40.10490408 +0000 UTC m=+483.012484108" watchObservedRunningTime="2026-04-16 18:25:40.105667801 +0000 UTC m=+483.013247829"
Apr 16 18:25:51.093704 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:51.093668 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-mwlxs"
Apr 16 18:25:51.885922 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:51.885888 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-zs26v"]
Apr 16 18:25:51.889093 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:51.889070 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-zs26v"
Apr 16 18:25:51.891321 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:51.891291 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 18:25:51.891465 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:51.891362 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-wg9bc\""
Apr 16 18:25:51.896486 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:51.896462 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-zs26v"]
Apr 16 18:25:51.992664 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:51.992621 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwlsw\" (UniqueName: \"kubernetes.io/projected/c3cd2a35-0050-49f4-a252-c823aa4a04a8-kube-api-access-mwlsw\") pod \"s3-init-zs26v\" (UID: \"c3cd2a35-0050-49f4-a252-c823aa4a04a8\") " pod="kserve/s3-init-zs26v"
Apr 16 18:25:52.093684 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:52.093651 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwlsw\" (UniqueName: \"kubernetes.io/projected/c3cd2a35-0050-49f4-a252-c823aa4a04a8-kube-api-access-mwlsw\") pod \"s3-init-zs26v\" (UID: \"c3cd2a35-0050-49f4-a252-c823aa4a04a8\") " pod="kserve/s3-init-zs26v"
Apr 16 18:25:52.104755 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:52.104726 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwlsw\" (UniqueName: \"kubernetes.io/projected/c3cd2a35-0050-49f4-a252-c823aa4a04a8-kube-api-access-mwlsw\") pod \"s3-init-zs26v\" (UID: \"c3cd2a35-0050-49f4-a252-c823aa4a04a8\") " pod="kserve/s3-init-zs26v"
Apr 16 18:25:52.212650 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:52.212569 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-zs26v"
Apr 16 18:25:52.329718 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:52.329576 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-zs26v"]
Apr 16 18:25:52.332474 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:25:52.332450 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3cd2a35_0050_49f4_a252_c823aa4a04a8.slice/crio-4ce2ed021319f8d18f8bea2af7da11c586b528ca1e7b0bfdd8a52786fa1f6b47 WatchSource:0}: Error finding container 4ce2ed021319f8d18f8bea2af7da11c586b528ca1e7b0bfdd8a52786fa1f6b47: Status 404 returned error can't find the container with id 4ce2ed021319f8d18f8bea2af7da11c586b528ca1e7b0bfdd8a52786fa1f6b47
Apr 16 18:25:53.125866 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:53.125823 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-zs26v" event={"ID":"c3cd2a35-0050-49f4-a252-c823aa4a04a8","Type":"ContainerStarted","Data":"4ce2ed021319f8d18f8bea2af7da11c586b528ca1e7b0bfdd8a52786fa1f6b47"}
Apr 16 18:25:57.139060 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:57.139023 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-zs26v" event={"ID":"c3cd2a35-0050-49f4-a252-c823aa4a04a8","Type":"ContainerStarted","Data":"ea86f9deccc2aeeb982308c6b5191e289a9674908a141ac13a55cec0d4019731"}
Apr 16 18:25:57.159559 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:25:57.159509 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-zs26v" podStartSLOduration=1.768184239 podStartE2EDuration="6.159494267s" podCreationTimestamp="2026-04-16 18:25:51 +0000 UTC" firstStartedPulling="2026-04-16 18:25:52.334281463 +0000 UTC m=+495.241861483" lastFinishedPulling="2026-04-16 18:25:56.725591502 +0000 UTC m=+499.633171511" observedRunningTime="2026-04-16 18:25:57.157753546 +0000 UTC m=+500.065333581" watchObservedRunningTime="2026-04-16 18:25:57.159494267 +0000 UTC m=+500.067074330"
Apr 16 18:26:00.149012 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:00.148898 2579 generic.go:358] "Generic (PLEG): container finished" podID="c3cd2a35-0050-49f4-a252-c823aa4a04a8" containerID="ea86f9deccc2aeeb982308c6b5191e289a9674908a141ac13a55cec0d4019731" exitCode=0
Apr 16 18:26:00.149012 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:00.148988 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-zs26v" event={"ID":"c3cd2a35-0050-49f4-a252-c823aa4a04a8","Type":"ContainerDied","Data":"ea86f9deccc2aeeb982308c6b5191e289a9674908a141ac13a55cec0d4019731"}
Apr 16 18:26:01.281916 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:01.281893 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-zs26v"
Apr 16 18:26:01.369983 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:01.369946 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwlsw\" (UniqueName: \"kubernetes.io/projected/c3cd2a35-0050-49f4-a252-c823aa4a04a8-kube-api-access-mwlsw\") pod \"c3cd2a35-0050-49f4-a252-c823aa4a04a8\" (UID: \"c3cd2a35-0050-49f4-a252-c823aa4a04a8\") "
Apr 16 18:26:01.372116 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:01.372081 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3cd2a35-0050-49f4-a252-c823aa4a04a8-kube-api-access-mwlsw" (OuterVolumeSpecName: "kube-api-access-mwlsw") pod "c3cd2a35-0050-49f4-a252-c823aa4a04a8" (UID: "c3cd2a35-0050-49f4-a252-c823aa4a04a8"). InnerVolumeSpecName "kube-api-access-mwlsw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:26:01.471118 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:01.471023 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mwlsw\" (UniqueName: \"kubernetes.io/projected/c3cd2a35-0050-49f4-a252-c823aa4a04a8-kube-api-access-mwlsw\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\""
Apr 16 18:26:02.155895 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:02.155860 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-zs26v" event={"ID":"c3cd2a35-0050-49f4-a252-c823aa4a04a8","Type":"ContainerDied","Data":"4ce2ed021319f8d18f8bea2af7da11c586b528ca1e7b0bfdd8a52786fa1f6b47"}
Apr 16 18:26:02.155895 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:02.155896 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ce2ed021319f8d18f8bea2af7da11c586b528ca1e7b0bfdd8a52786fa1f6b47"
Apr 16 18:26:02.155895 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:02.155872 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-zs26v"
Apr 16 18:26:09.815565 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:09.815528 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb"]
Apr 16 18:26:09.815963 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:09.815949 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3cd2a35-0050-49f4-a252-c823aa4a04a8" containerName="s3-init"
Apr 16 18:26:09.816005 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:09.815967 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3cd2a35-0050-49f4-a252-c823aa4a04a8" containerName="s3-init"
Apr 16 18:26:09.816052 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:09.816041 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3cd2a35-0050-49f4-a252-c823aa4a04a8" containerName="s3-init"
Apr 16 18:26:09.819381 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:09.819362 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb"
Apr 16 18:26:09.821807 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:09.821779 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-kt2gs\""
Apr 16 18:26:09.827254 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:09.826798 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb"]
Apr 16 18:26:09.831368 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:09.831339 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb"
Apr 16 18:26:09.972315 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:09.972281 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb"]
Apr 16 18:26:09.976092 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:26:09.976064 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4c3cbdb_b4e6_4cdb_8bb6_dbf94be0c910.slice/crio-ae74ba606c533be463e3566bc3b73d74595ed68deae9d06c35014c575863ac1d WatchSource:0}: Error finding container ae74ba606c533be463e3566bc3b73d74595ed68deae9d06c35014c575863ac1d: Status 404 returned error can't find the container with id ae74ba606c533be463e3566bc3b73d74595ed68deae9d06c35014c575863ac1d
Apr 16 18:26:10.182370 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:10.182288 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" event={"ID":"a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910","Type":"ContainerStarted","Data":"ae74ba606c533be463e3566bc3b73d74595ed68deae9d06c35014c575863ac1d"}
Apr 16 18:26:10.209471 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:10.209436 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"]
Apr 16 18:26:10.213689 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:10.213657 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"
Apr 16 18:26:10.221500 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:10.221298 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"]
Apr 16 18:26:10.348176 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:10.348143 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bb728d0-b8e7-4e33-8db9-f06483b7bc29-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-555687cc47-vzzzx\" (UID: \"2bb728d0-b8e7-4e33-8db9-f06483b7bc29\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"
Apr 16 18:26:10.449602 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:10.449505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bb728d0-b8e7-4e33-8db9-f06483b7bc29-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-555687cc47-vzzzx\" (UID: \"2bb728d0-b8e7-4e33-8db9-f06483b7bc29\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"
Apr 16 18:26:10.449891 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:10.449869 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bb728d0-b8e7-4e33-8db9-f06483b7bc29-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-555687cc47-vzzzx\" (UID: \"2bb728d0-b8e7-4e33-8db9-f06483b7bc29\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"
Apr 16 18:26:10.524960 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:10.524915 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"
Apr 16 18:26:10.719763 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:10.719524 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"]
Apr 16 18:26:11.189411 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:11.189317 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" event={"ID":"2bb728d0-b8e7-4e33-8db9-f06483b7bc29","Type":"ContainerStarted","Data":"8d3a079d2b4aaaeb2e55094142fc47a92807d1d071898f9a7badcdba38d7535d"}
Apr 16 18:26:24.236574 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:24.236523 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" event={"ID":"2bb728d0-b8e7-4e33-8db9-f06483b7bc29","Type":"ContainerStarted","Data":"ac7823411ba8607a86ccd17d38d40bb78afd5c0b89acb6fbb13b005a48eb0051"}
Apr 16 18:26:24.238416 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:24.238375 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" event={"ID":"a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910","Type":"ContainerStarted","Data":"55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2"}
Apr 16 18:26:24.238596 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:24.238576 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb"
Apr 16 18:26:24.239822 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:24.239797 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" podUID="a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 18:26:24.272913 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:24.272853 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" podStartSLOduration=1.425605804 podStartE2EDuration="15.272835015s" podCreationTimestamp="2026-04-16 18:26:09 +0000 UTC" firstStartedPulling="2026-04-16 18:26:09.977992746 +0000 UTC m=+512.885572765" lastFinishedPulling="2026-04-16 18:26:23.825221966 +0000 UTC m=+526.732801976" observedRunningTime="2026-04-16 18:26:24.27260455 +0000 UTC m=+527.180184571" watchObservedRunningTime="2026-04-16 18:26:24.272835015 +0000 UTC m=+527.180415043"
Apr 16 18:26:25.241799 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:25.241759 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" podUID="a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 18:26:27.248647 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:27.248612 2579 generic.go:358] "Generic (PLEG): container finished" podID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerID="ac7823411ba8607a86ccd17d38d40bb78afd5c0b89acb6fbb13b005a48eb0051" exitCode=0
Apr 16 18:26:27.248647 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:27.248653 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" event={"ID":"2bb728d0-b8e7-4e33-8db9-f06483b7bc29","Type":"ContainerDied","Data":"ac7823411ba8607a86ccd17d38d40bb78afd5c0b89acb6fbb13b005a48eb0051"}
Apr 16 18:26:33.270746 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:33.270709 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" event={"ID":"2bb728d0-b8e7-4e33-8db9-f06483b7bc29","Type":"ContainerStarted","Data":"54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984"}
Apr 16 18:26:33.271134 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:33.271005 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"
Apr 16 18:26:33.272417 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:33.272371 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 16 18:26:33.288539 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:33.288481 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" podStartSLOduration=1.2108873390000001 podStartE2EDuration="23.288440342s" podCreationTimestamp="2026-04-16 18:26:10 +0000 UTC" firstStartedPulling="2026-04-16 18:26:10.727041281 +0000 UTC m=+513.634621295" lastFinishedPulling="2026-04-16 18:26:32.804594288 +0000 UTC m=+535.712174298" observedRunningTime="2026-04-16 18:26:33.287497142 +0000 UTC m=+536.195077203" watchObservedRunningTime="2026-04-16 18:26:33.288440342 +0000 UTC m=+536.196020369"
Apr 16 18:26:34.274619 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:34.274583 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 16 18:26:35.242159 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:35.242107 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" podUID="a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 18:26:44.275278 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:44.275232 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 16 18:26:45.242627 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:45.242575 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" podUID="a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 18:26:54.275564 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:54.275517 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 16 18:26:55.242684 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:26:55.242644 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" podUID="a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 18:27:04.275095 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:04.275040 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 16 18:27:05.242816 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:05.242774 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" podUID="a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused"
Apr 16 18:27:14.275502 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:14.275451 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 16 18:27:15.243450 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:15.243413 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb"
Apr 16 18:27:24.274763 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:24.274720 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 16 18:27:34.275065 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:34.275020 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 16 18:27:37.566145 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:37.566116 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log"
Apr 16 18:27:37.568095 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:37.568074 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log"
Apr 16 18:27:40.037280 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:40.037250 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb"]
Apr 16 18:27:40.037659 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:40.037533 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" podUID="a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910" containerName="kserve-container" containerID="cri-o://55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2" gracePeriod=30
pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" podUID="a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910" containerName="kserve-container" containerID="cri-o://55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2" gracePeriod=30 Apr 16 18:27:40.070703 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:40.070670 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl"] Apr 16 18:27:40.073404 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:40.073377 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" Apr 16 18:27:40.082967 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:40.082938 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl"] Apr 16 18:27:40.083469 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:40.083451 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" Apr 16 18:27:40.223210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:40.223187 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl"] Apr 16 18:27:40.473910 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:40.473872 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" event={"ID":"64ee8359-c36f-4aea-af5f-1aa8e35a8a5f","Type":"ContainerStarted","Data":"10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8"} Apr 16 18:27:40.473910 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:40.473910 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" event={"ID":"64ee8359-c36f-4aea-af5f-1aa8e35a8a5f","Type":"ContainerStarted","Data":"9cb5291882aa8cd694b07730a2f6acb8ddfe8fc437c7f49953c2f10bdfb258d5"} Apr 16 18:27:40.474165 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:40.474072 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" Apr 16 18:27:40.475480 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:40.475449 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" podUID="64ee8359-c36f-4aea-af5f-1aa8e35a8a5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 16 18:27:40.493418 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:40.493346 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" podStartSLOduration=0.493332422 podStartE2EDuration="493.332422ms" podCreationTimestamp="2026-04-16 18:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:27:40.491100253 +0000 UTC m=+603.398680283" watchObservedRunningTime="2026-04-16 18:27:40.493332422 +0000 UTC m=+603.400912450" Apr 16 18:27:41.477402 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:41.477347 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" 
podUID="64ee8359-c36f-4aea-af5f-1aa8e35a8a5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 16 18:27:43.173895 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:43.173868 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" Apr 16 18:27:43.483574 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:43.483486 2579 generic.go:358] "Generic (PLEG): container finished" podID="a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910" containerID="55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2" exitCode=0 Apr 16 18:27:43.483574 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:43.483544 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" Apr 16 18:27:43.483755 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:43.483574 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" event={"ID":"a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910","Type":"ContainerDied","Data":"55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2"} Apr 16 18:27:43.483755 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:43.483610 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb" event={"ID":"a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910","Type":"ContainerDied","Data":"ae74ba606c533be463e3566bc3b73d74595ed68deae9d06c35014c575863ac1d"} Apr 16 18:27:43.483755 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:43.483627 2579 scope.go:117] "RemoveContainer" containerID="55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2" Apr 16 18:27:43.491952 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:43.491932 2579 scope.go:117] "RemoveContainer" containerID="55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2" Apr 16 18:27:43.492228 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:27:43.492210 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2\": container with ID starting with 55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2 not found: ID does not exist" containerID="55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2" Apr 16 18:27:43.492291 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:43.492241 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2"} err="failed to get container status \"55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2\": rpc error: code = NotFound desc = could not find container \"55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2\": container with ID starting with 55d117255ba2b55fba41eaa32a50ce6edb5ca654c8746b23c2f2c830c751fbf2 not found: ID does not exist" Apr 16 18:27:43.518521 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:43.518486 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb"] Apr 16 18:27:43.527215 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:43.527185 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-3596a-predictor-7fd7565d7-hcqrb"] Apr 16 
Apr 16 18:27:44.276174 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:44.276145 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"
Apr 16 18:27:51.477817 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:27:51.477773 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" podUID="64ee8359-c36f-4aea-af5f-1aa8e35a8a5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 18:28:01.478006 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:01.477909 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" podUID="64ee8359-c36f-4aea-af5f-1aa8e35a8a5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 18:28:11.477731 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:11.477681 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" podUID="64ee8359-c36f-4aea-af5f-1aa8e35a8a5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 18:28:21.478454 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:21.478385 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" podUID="64ee8359-c36f-4aea-af5f-1aa8e35a8a5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 18:28:29.801300 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:29.801264 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"]
Apr 16 18:28:29.801799 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:29.801549 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="kserve-container" containerID="cri-o://54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984" gracePeriod=30
Apr 16 18:28:29.854598 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:29.854566 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls"]
Apr 16 18:28:29.854952 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:29.854936 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910" containerName="kserve-container"
Apr 16 18:28:29.855033 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:29.854956 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910" containerName="kserve-container"
Apr 16 18:28:29.855033 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:29.855027 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4c3cbdb-b4e6-4cdb-8bb6-dbf94be0c910" containerName="kserve-container"
Apr 16 18:28:29.857896 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:29.857859 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls"
Apr 16 18:28:29.866882 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:29.866856 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls"]
Apr 16 18:28:29.868016 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:29.868001 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls"
Apr 16 18:28:30.004259 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:30.004224 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls"]
Apr 16 18:28:30.007490 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:28:30.007461 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04c8be21_b8e6_400f_a327_f677c1e7a15e.slice/crio-c354fb9d1c2ee09ea78c9b5432ff42d2a485f9687d50fb404fbf1a6e68c3926d WatchSource:0}: Error finding container c354fb9d1c2ee09ea78c9b5432ff42d2a485f9687d50fb404fbf1a6e68c3926d: Status 404 returned error can't find the container with id c354fb9d1c2ee09ea78c9b5432ff42d2a485f9687d50fb404fbf1a6e68c3926d
Apr 16 18:28:30.625228 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:30.625186 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" event={"ID":"04c8be21-b8e6-400f-a327-f677c1e7a15e","Type":"ContainerStarted","Data":"89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86"}
Apr 16 18:28:30.625228 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:30.625230 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" event={"ID":"04c8be21-b8e6-400f-a327-f677c1e7a15e","Type":"ContainerStarted","Data":"c354fb9d1c2ee09ea78c9b5432ff42d2a485f9687d50fb404fbf1a6e68c3926d"}
Apr 16 18:28:30.625462 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:30.625427 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls"
Apr 16 18:28:30.626894 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:30.626868 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" podUID="04c8be21-b8e6-400f-a327-f677c1e7a15e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 16 18:28:30.642912 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:30.642860 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" podStartSLOduration=1.642846251 podStartE2EDuration="1.642846251s" podCreationTimestamp="2026-04-16 18:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:28:30.642307034 +0000 UTC m=+653.549887085" watchObservedRunningTime="2026-04-16 18:28:30.642846251 +0000 UTC m=+653.550426279"
Apr 16 18:28:31.479121 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:31.479087 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl"
pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" Apr 16 18:28:31.628685 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:31.628651 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" podUID="04c8be21-b8e6-400f-a327-f677c1e7a15e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 16 18:28:34.040359 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.040333 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" Apr 16 18:28:34.069659 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.069634 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bb728d0-b8e7-4e33-8db9-f06483b7bc29-kserve-provision-location\") pod \"2bb728d0-b8e7-4e33-8db9-f06483b7bc29\" (UID: \"2bb728d0-b8e7-4e33-8db9-f06483b7bc29\") " Apr 16 18:28:34.070045 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.070017 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb728d0-b8e7-4e33-8db9-f06483b7bc29-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2bb728d0-b8e7-4e33-8db9-f06483b7bc29" (UID: "2bb728d0-b8e7-4e33-8db9-f06483b7bc29"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:28:34.170532 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.170449 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2bb728d0-b8e7-4e33-8db9-f06483b7bc29-kserve-provision-location\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 18:28:34.637831 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.637793 2579 generic.go:358] "Generic (PLEG): container finished" podID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerID="54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984" exitCode=0 Apr 16 18:28:34.638004 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.637864 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" Apr 16 18:28:34.638004 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.637875 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" event={"ID":"2bb728d0-b8e7-4e33-8db9-f06483b7bc29","Type":"ContainerDied","Data":"54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984"} Apr 16 18:28:34.638004 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.637918 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx" event={"ID":"2bb728d0-b8e7-4e33-8db9-f06483b7bc29","Type":"ContainerDied","Data":"8d3a079d2b4aaaeb2e55094142fc47a92807d1d071898f9a7badcdba38d7535d"} Apr 16 18:28:34.638004 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.637934 2579 scope.go:117] "RemoveContainer" containerID="54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984" Apr 16 18:28:34.653675 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.653619 2579 scope.go:117] "RemoveContainer" containerID="ac7823411ba8607a86ccd17d38d40bb78afd5c0b89acb6fbb13b005a48eb0051" Apr 16 18:28:34.660892 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.660877 2579 scope.go:117] "RemoveContainer" containerID="54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984" Apr 16 18:28:34.661128 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:28:34.661108 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984\": container with ID starting with 54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984 not found: ID does not exist" containerID="54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984" Apr 16 18:28:34.661197 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.661141 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984"} err="failed to get container status \"54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984\": rpc error: code = NotFound desc = could not find container \"54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984\": container with ID starting with 54c1fc262dd2877237f5556a7d8c1cf0c379c2b24d8fce1f6e020805bfbd3984 not found: ID does not exist" Apr 16 18:28:34.661197 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.661166 2579 scope.go:117] "RemoveContainer" containerID="ac7823411ba8607a86ccd17d38d40bb78afd5c0b89acb6fbb13b005a48eb0051" Apr 16 18:28:34.661415 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:28:34.661386 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac7823411ba8607a86ccd17d38d40bb78afd5c0b89acb6fbb13b005a48eb0051\": container with ID starting with ac7823411ba8607a86ccd17d38d40bb78afd5c0b89acb6fbb13b005a48eb0051 not found: ID does not exist" containerID="ac7823411ba8607a86ccd17d38d40bb78afd5c0b89acb6fbb13b005a48eb0051" Apr 16 18:28:34.661461 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.661421 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac7823411ba8607a86ccd17d38d40bb78afd5c0b89acb6fbb13b005a48eb0051"} err="failed to get container status \"ac7823411ba8607a86ccd17d38d40bb78afd5c0b89acb6fbb13b005a48eb0051\": rpc error: code = 
Apr 16 18:28:34.665161 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.665139 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"]
Apr 16 18:28:34.669529 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:34.669512 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-555687cc47-vzzzx"]
Apr 16 18:28:35.667974 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:35.667941 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" path="/var/lib/kubelet/pods/2bb728d0-b8e7-4e33-8db9-f06483b7bc29/volumes"
Apr 16 18:28:41.628960 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:41.628896 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" podUID="04c8be21-b8e6-400f-a327-f677c1e7a15e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 16 18:28:51.629348 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:28:51.629295 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" podUID="04c8be21-b8e6-400f-a327-f677c1e7a15e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 16 18:29:01.629739 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:29:01.629689 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" podUID="04c8be21-b8e6-400f-a327-f677c1e7a15e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 16 18:29:11.629723 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:29:11.629676 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" podUID="04c8be21-b8e6-400f-a327-f677c1e7a15e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 16 18:29:21.629599 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:29:21.629566 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls"
Apr 16 18:32:37.587885 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:32:37.587855 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log"
Apr 16 18:32:37.589629 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:32:37.589607 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log"
Apr 16 18:37:04.976432 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:04.976344 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl"]
Apr 16 18:37:04.976951 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:04.976660 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" podUID="64ee8359-c36f-4aea-af5f-1aa8e35a8a5f" containerName="kserve-container" containerID="cri-o://10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8" gracePeriod=30
Apr 16 18:37:05.026701 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:05.026669 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk"]
Apr 16 18:37:05.026982 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:05.026970 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="storage-initializer"
Apr 16 18:37:05.026982 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:05.026984 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="storage-initializer"
Apr 16 18:37:05.027062 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:05.027002 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="kserve-container"
Apr 16 18:37:05.027062 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:05.027008 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="kserve-container"
Apr 16 18:37:05.027062 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:05.027056 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2bb728d0-b8e7-4e33-8db9-f06483b7bc29" containerName="kserve-container"
Apr 16 18:37:05.029919 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:05.029903 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk"
Apr 16 18:37:05.042411 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:05.042375 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk"
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" Apr 16 18:37:05.050879 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:05.050856 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk"] Apr 16 18:37:05.184351 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:05.184313 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk"] Apr 16 18:37:05.187574 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:37:05.187543 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6052ebbe_2eda_4f08_b6c5_eef27bd21433.slice/crio-d0d1bc0419b25ed1c471184b3d5e03530b443b16a64e93f0a248d3ce2e0a9b28 WatchSource:0}: Error finding container d0d1bc0419b25ed1c471184b3d5e03530b443b16a64e93f0a248d3ce2e0a9b28: Status 404 returned error can't find the container with id d0d1bc0419b25ed1c471184b3d5e03530b443b16a64e93f0a248d3ce2e0a9b28 Apr 16 18:37:05.189423 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:05.189372 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:37:06.132725 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:06.132692 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" event={"ID":"6052ebbe-2eda-4f08-b6c5-eef27bd21433","Type":"ContainerStarted","Data":"5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4"} Apr 16 18:37:06.132725 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:06.132728 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" event={"ID":"6052ebbe-2eda-4f08-b6c5-eef27bd21433","Type":"ContainerStarted","Data":"d0d1bc0419b25ed1c471184b3d5e03530b443b16a64e93f0a248d3ce2e0a9b28"} Apr 16 18:37:06.133147 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:06.132935 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" Apr 16 18:37:06.134210 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:06.134186 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" podUID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 18:37:07.136576 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:07.136536 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" podUID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 18:37:08.119171 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:08.119149 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" Apr 16 18:37:08.138180 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:08.138135 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" podStartSLOduration=3.138117595 podStartE2EDuration="3.138117595s" podCreationTimestamp="2026-04-16 18:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:37:06.151931079 +0000 UTC m=+1169.059511107" watchObservedRunningTime="2026-04-16 18:37:08.138117595 +0000 UTC m=+1171.045697623" Apr 16 18:37:08.139942 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:08.139915 2579 generic.go:358] "Generic (PLEG): container finished" podID="64ee8359-c36f-4aea-af5f-1aa8e35a8a5f" containerID="10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8" exitCode=0 Apr 16 18:37:08.140066 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:08.139971 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" event={"ID":"64ee8359-c36f-4aea-af5f-1aa8e35a8a5f","Type":"ContainerDied","Data":"10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8"} Apr 16 18:37:08.140066 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:08.139995 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" event={"ID":"64ee8359-c36f-4aea-af5f-1aa8e35a8a5f","Type":"ContainerDied","Data":"9cb5291882aa8cd694b07730a2f6acb8ddfe8fc437c7f49953c2f10bdfb258d5"} Apr 16 18:37:08.140066 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:08.140011 2579 scope.go:117] "RemoveContainer" containerID="10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8" Apr 16 18:37:08.140066 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:08.139974 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl" Apr 16 18:37:08.149000 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:08.148989 2579 scope.go:117] "RemoveContainer" containerID="10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8" Apr 16 18:37:08.149269 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:37:08.149248 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8\": container with ID starting with 10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8 not found: ID does not exist" containerID="10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8" Apr 16 18:37:08.149359 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:08.149276 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8"} err="failed to get container status \"10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8\": rpc error: code = NotFound desc = could not find container \"10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8\": container with ID starting with 10d7351b07f477dd0254293cdbecf1f8df7731e35de5c9315fcf756f89014fb8 not found: ID does not exist" Apr 16 18:37:08.164645 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:08.164624 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl"] Apr 16 18:37:08.169773 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:08.169749 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-6138a-predictor-57b886cd85-wlhjl"] Apr 16 18:37:09.668714 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:09.668684 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ee8359-c36f-4aea-af5f-1aa8e35a8a5f" path="/var/lib/kubelet/pods/64ee8359-c36f-4aea-af5f-1aa8e35a8a5f/volumes" Apr 16 18:37:17.136654 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:17.136609 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" podUID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 18:37:27.137289 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:27.137241 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" podUID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 18:37:37.137242 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:37.137193 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" podUID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 18:37:37.609126 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:37.609100 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:37:37.611827 ip-10-0-138-88 kubenswrapper[2579]: I0416 
18:37:37.611803 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:37:47.137523 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:47.137478 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" podUID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 18:37:54.677295 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:54.677256 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls"] Apr 16 18:37:54.677661 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:54.677510 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" podUID="04c8be21-b8e6-400f-a327-f677c1e7a15e" containerName="kserve-container" containerID="cri-o://89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86" gracePeriod=30 Apr 16 18:37:54.696870 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:54.696836 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg"] Apr 16 18:37:54.697168 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:54.697156 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64ee8359-c36f-4aea-af5f-1aa8e35a8a5f" containerName="kserve-container" Apr 16 18:37:54.697223 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:54.697169 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ee8359-c36f-4aea-af5f-1aa8e35a8a5f" containerName="kserve-container" Apr 16 18:37:54.697223 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:54.697220 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="64ee8359-c36f-4aea-af5f-1aa8e35a8a5f" containerName="kserve-container" Apr 16 18:37:54.700986 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:54.700966 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" Apr 16 18:37:54.707897 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:54.707864 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg"] Apr 16 18:37:54.710909 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:54.710887 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" Apr 16 18:37:54.840785 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:54.840752 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg"] Apr 16 18:37:54.843917 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:37:54.843892 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24dda508_a2ca_4338_bff6_5989f30bc584.slice/crio-df17091705e542e32b847823d1ce39c5fe73faa6378036fd419831047f010dfb WatchSource:0}: Error finding container df17091705e542e32b847823d1ce39c5fe73faa6378036fd419831047f010dfb: Status 404 returned error can't find the container with id df17091705e542e32b847823d1ce39c5fe73faa6378036fd419831047f010dfb Apr 16 18:37:55.279639 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:55.279597 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" event={"ID":"24dda508-a2ca-4338-bff6-5989f30bc584","Type":"ContainerStarted","Data":"75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c"} Apr 16 18:37:55.279639 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:55.279636 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" event={"ID":"24dda508-a2ca-4338-bff6-5989f30bc584","Type":"ContainerStarted","Data":"df17091705e542e32b847823d1ce39c5fe73faa6378036fd419831047f010dfb"} Apr 16 18:37:55.279864 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:55.279816 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" Apr 16 18:37:55.281164 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:55.281138 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" podUID="24dda508-a2ca-4338-bff6-5989f30bc584" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:37:55.296315 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:55.296273 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" podStartSLOduration=1.29626155 podStartE2EDuration="1.29626155s" podCreationTimestamp="2026-04-16 18:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:37:55.294592823 +0000 UTC m=+1218.202172851" watchObservedRunningTime="2026-04-16 18:37:55.29626155 +0000 UTC m=+1218.203841579" Apr 16 18:37:56.282251 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:56.282212 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" podUID="24dda508-a2ca-4338-bff6-5989f30bc584" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:37:57.138107 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:57.138067 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" Apr 16 18:37:57.831224 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:57.831198 2579 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" Apr 16 18:37:58.288939 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:58.288905 2579 generic.go:358] "Generic (PLEG): container finished" podID="04c8be21-b8e6-400f-a327-f677c1e7a15e" containerID="89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86" exitCode=0 Apr 16 18:37:58.289097 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:58.288970 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" Apr 16 18:37:58.289097 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:58.288986 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" event={"ID":"04c8be21-b8e6-400f-a327-f677c1e7a15e","Type":"ContainerDied","Data":"89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86"} Apr 16 18:37:58.289097 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:58.289022 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls" event={"ID":"04c8be21-b8e6-400f-a327-f677c1e7a15e","Type":"ContainerDied","Data":"c354fb9d1c2ee09ea78c9b5432ff42d2a485f9687d50fb404fbf1a6e68c3926d"} Apr 16 18:37:58.289097 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:58.289040 2579 scope.go:117] "RemoveContainer" containerID="89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86" Apr 16 18:37:58.302469 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:58.302449 2579 scope.go:117] "RemoveContainer" containerID="89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86" Apr 16 18:37:58.302702 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:37:58.302685 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86\": container with ID starting with 89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86 not found: ID does not exist" containerID="89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86" Apr 16 18:37:58.302750 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:58.302709 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86"} err="failed to get container status \"89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86\": rpc error: code = NotFound desc = could not find container \"89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86\": container with ID starting with 89b724562615f03d77b35c4687487884fe6578156aaedf7876a12db0173b5a86 not found: ID does not exist" Apr 16 18:37:58.313379 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:58.313357 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls"] Apr 16 18:37:58.317667 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:58.317646 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b16e7-predictor-57c5fdffb5-gqhls"] Apr 16 18:37:59.668843 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:37:59.668808 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04c8be21-b8e6-400f-a327-f677c1e7a15e" path="/var/lib/kubelet/pods/04c8be21-b8e6-400f-a327-f677c1e7a15e/volumes" Apr 16 18:38:06.282894 
ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:06.282848 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" podUID="24dda508-a2ca-4338-bff6-5989f30bc584" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:38:16.282694 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:16.282650 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" podUID="24dda508-a2ca-4338-bff6-5989f30bc584" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:38:25.361475 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:25.361441 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9"] Apr 16 18:38:25.361931 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:25.361914 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04c8be21-b8e6-400f-a327-f677c1e7a15e" containerName="kserve-container" Apr 16 18:38:25.361996 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:25.361934 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="04c8be21-b8e6-400f-a327-f677c1e7a15e" containerName="kserve-container" Apr 16 18:38:25.362046 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:25.362013 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="04c8be21-b8e6-400f-a327-f677c1e7a15e" containerName="kserve-container" Apr 16 18:38:25.364963 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:25.364942 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" Apr 16 18:38:25.374279 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:25.374252 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9"] Apr 16 18:38:25.376048 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:25.376030 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" Apr 16 18:38:25.378623 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:25.378601 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk"] Apr 16 18:38:25.378862 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:25.378839 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" podUID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" containerName="kserve-container" containerID="cri-o://5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4" gracePeriod=30 Apr 16 18:38:25.504763 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:25.504706 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9"] Apr 16 18:38:25.507407 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:38:25.507361 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd97424df_8707_43bc_8e27_47c356e78bc4.slice/crio-af876f9c057a35226bec28d7153fac99ad73815a5d1858626344e2a59294a3f7 WatchSource:0}: Error finding container af876f9c057a35226bec28d7153fac99ad73815a5d1858626344e2a59294a3f7: Status 404 returned error can't find the container with id af876f9c057a35226bec28d7153fac99ad73815a5d1858626344e2a59294a3f7 Apr 16 18:38:26.282675 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:26.282632 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" podUID="24dda508-a2ca-4338-bff6-5989f30bc584" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:38:26.372589 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:26.372555 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" event={"ID":"d97424df-8707-43bc-8e27-47c356e78bc4","Type":"ContainerStarted","Data":"6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480"} Apr 16 18:38:26.372589 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:26.372594 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" event={"ID":"d97424df-8707-43bc-8e27-47c356e78bc4","Type":"ContainerStarted","Data":"af876f9c057a35226bec28d7153fac99ad73815a5d1858626344e2a59294a3f7"} Apr 16 18:38:26.372982 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:26.372707 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" Apr 16 18:38:26.374030 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:26.374006 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" podUID="d97424df-8707-43bc-8e27-47c356e78bc4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:38:26.389535 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:26.389487 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" podStartSLOduration=1.38947387 podStartE2EDuration="1.38947387s" podCreationTimestamp="2026-04-16 18:38:25 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:38:26.388497138 +0000 UTC m=+1249.296077164" watchObservedRunningTime="2026-04-16 18:38:26.38947387 +0000 UTC m=+1249.297053899" Apr 16 18:38:27.137514 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:27.137469 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" podUID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 16 18:38:27.375988 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:27.375950 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" podUID="d97424df-8707-43bc-8e27-47c356e78bc4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:38:29.321593 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:29.321572 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" Apr 16 18:38:29.382109 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:29.382022 2579 generic.go:358] "Generic (PLEG): container finished" podID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" containerID="5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4" exitCode=0 Apr 16 18:38:29.382109 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:29.382069 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" event={"ID":"6052ebbe-2eda-4f08-b6c5-eef27bd21433","Type":"ContainerDied","Data":"5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4"} Apr 16 18:38:29.382109 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:29.382083 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" Apr 16 18:38:29.382109 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:29.382101 2579 scope.go:117] "RemoveContainer" containerID="5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4" Apr 16 18:38:29.382413 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:29.382092 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk" event={"ID":"6052ebbe-2eda-4f08-b6c5-eef27bd21433","Type":"ContainerDied","Data":"d0d1bc0419b25ed1c471184b3d5e03530b443b16a64e93f0a248d3ce2e0a9b28"} Apr 16 18:38:29.389974 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:29.389955 2579 scope.go:117] "RemoveContainer" containerID="5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4" Apr 16 18:38:29.390250 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:38:29.390234 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4\": container with ID starting with 5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4 not found: ID does not exist" containerID="5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4" Apr 16 18:38:29.390296 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:29.390258 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4"} err="failed to get container status \"5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4\": rpc error: code = NotFound desc = could not find container \"5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4\": container with ID starting with 5f9b7d76d03d8c03b93e7115c96f192883901264e181712e0425c5ad0e5757c4 not found: ID does not exist" Apr 16 18:38:29.403794 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:29.403762 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk"] Apr 16 18:38:29.407481 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:29.407459 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-091ad-predictor-bd77f565-lmnzk"] Apr 16 18:38:29.668118 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:29.668039 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" path="/var/lib/kubelet/pods/6052ebbe-2eda-4f08-b6c5-eef27bd21433/volumes" Apr 16 18:38:36.283269 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:36.283224 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" podUID="24dda508-a2ca-4338-bff6-5989f30bc584" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:38:37.376556 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:37.376515 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" podUID="d97424df-8707-43bc-8e27-47c356e78bc4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:38:46.283585 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:46.283554 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" Apr 16 18:38:47.376794 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:47.376752 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" podUID="d97424df-8707-43bc-8e27-47c356e78bc4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:38:57.376594 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:38:57.376552 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" podUID="d97424df-8707-43bc-8e27-47c356e78bc4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:39:07.377018 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:07.376970 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" podUID="d97424df-8707-43bc-8e27-47c356e78bc4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 18:39:14.936693 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:14.936659 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66"] Apr 16 18:39:14.937066 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:14.936986 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" containerName="kserve-container" Apr 16 18:39:14.937066 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:14.936997 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" containerName="kserve-container" Apr 16 18:39:14.937066 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:14.937043 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6052ebbe-2eda-4f08-b6c5-eef27bd21433" containerName="kserve-container" Apr 16 18:39:14.939854 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:14.939831 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" Apr 16 18:39:14.940136 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:14.940113 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg"] Apr 16 18:39:14.940358 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:14.940335 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" podUID="24dda508-a2ca-4338-bff6-5989f30bc584" containerName="kserve-container" containerID="cri-o://75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c" gracePeriod=30 Apr 16 18:39:14.947569 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:14.947550 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66"] Apr 16 18:39:14.950258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:14.950243 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" Apr 16 18:39:15.078642 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:15.078615 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66"] Apr 16 18:39:15.081590 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:39:15.081560 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e13a8de_484c_49a5_afbc_663232b15c46.slice/crio-a18406b26db658739b9fbeee68e6b20d2832b94f806613d1f70e14af40d7ab96 WatchSource:0}: Error finding container a18406b26db658739b9fbeee68e6b20d2832b94f806613d1f70e14af40d7ab96: Status 404 returned error can't find the container with id a18406b26db658739b9fbeee68e6b20d2832b94f806613d1f70e14af40d7ab96 Apr 16 18:39:15.516420 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:15.516366 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" event={"ID":"0e13a8de-484c-49a5-afbc-663232b15c46","Type":"ContainerStarted","Data":"6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe"} Apr 16 18:39:15.516420 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:15.516425 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" event={"ID":"0e13a8de-484c-49a5-afbc-663232b15c46","Type":"ContainerStarted","Data":"a18406b26db658739b9fbeee68e6b20d2832b94f806613d1f70e14af40d7ab96"} Apr 16 18:39:15.516672 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:15.516522 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" Apr 16 18:39:15.517655 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:15.517630 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" podUID="0e13a8de-484c-49a5-afbc-663232b15c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 18:39:15.532561 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:15.532510 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" podStartSLOduration=1.532495081 podStartE2EDuration="1.532495081s" podCreationTimestamp="2026-04-16 18:39:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:39:15.530939498 +0000 UTC m=+1298.438519529" watchObservedRunningTime="2026-04-16 18:39:15.532495081 +0000 UTC m=+1298.440075109" Apr 16 18:39:16.282546 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:16.282494 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" podUID="24dda508-a2ca-4338-bff6-5989f30bc584" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 18:39:16.519478 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:16.519445 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" podUID="0e13a8de-484c-49a5-afbc-663232b15c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: 
connection refused" Apr 16 18:39:17.377535 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:17.377502 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" Apr 16 18:39:18.076874 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:18.076834 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" Apr 16 18:39:18.528009 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:18.527975 2579 generic.go:358] "Generic (PLEG): container finished" podID="24dda508-a2ca-4338-bff6-5989f30bc584" containerID="75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c" exitCode=0 Apr 16 18:39:18.528415 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:18.528035 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" Apr 16 18:39:18.528415 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:18.528065 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" event={"ID":"24dda508-a2ca-4338-bff6-5989f30bc584","Type":"ContainerDied","Data":"75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c"} Apr 16 18:39:18.528415 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:18.528102 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg" event={"ID":"24dda508-a2ca-4338-bff6-5989f30bc584","Type":"ContainerDied","Data":"df17091705e542e32b847823d1ce39c5fe73faa6378036fd419831047f010dfb"} Apr 16 18:39:18.528415 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:18.528118 2579 scope.go:117] "RemoveContainer" containerID="75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c" Apr 16 18:39:18.536036 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:18.536020 2579 scope.go:117] "RemoveContainer" containerID="75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c" Apr 16 18:39:18.536320 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:39:18.536294 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c\": container with ID starting with 75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c not found: ID does not exist" containerID="75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c" Apr 16 18:39:18.536431 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:18.536327 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c"} err="failed to get container status \"75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c\": rpc error: code = NotFound desc = could not find container \"75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c\": container with ID starting with 75810951ca791ee9937d81342f64cef070e652f56127468d5fd7b4c4c4b46f8c not found: ID does not exist" Apr 16 18:39:18.550163 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:18.550141 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg"] Apr 16 18:39:18.557622 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:18.557602 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-0f5d9-predictor-5bc668d579-ggjbg"] Apr 16 18:39:19.669522 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:19.669489 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24dda508-a2ca-4338-bff6-5989f30bc584" path="/var/lib/kubelet/pods/24dda508-a2ca-4338-bff6-5989f30bc584/volumes" Apr 16 18:39:26.519686 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:26.519642 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" podUID="0e13a8de-484c-49a5-afbc-663232b15c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 18:39:36.519884 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:36.519839 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" podUID="0e13a8de-484c-49a5-afbc-663232b15c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 18:39:46.519979 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:46.519922 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" podUID="0e13a8de-484c-49a5-afbc-663232b15c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 18:39:56.519595 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:39:56.519501 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" podUID="0e13a8de-484c-49a5-afbc-663232b15c46" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 18:40:06.521327 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:40:06.521293 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" Apr 16 18:42:37.629890 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:42:37.629861 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:42:37.632787 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:42:37.632763 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:47:37.655986 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:37.655955 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:47:37.659168 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:37.659143 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:47:50.210452 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.210414 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9"] Apr 16 18:47:50.210842 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.210719 2579 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" podUID="d97424df-8707-43bc-8e27-47c356e78bc4" containerName="kserve-container" containerID="cri-o://6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480" gracePeriod=30 Apr 16 18:47:50.273707 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.273670 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd"] Apr 16 18:47:50.274152 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.274134 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24dda508-a2ca-4338-bff6-5989f30bc584" containerName="kserve-container" Apr 16 18:47:50.274245 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.274155 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="24dda508-a2ca-4338-bff6-5989f30bc584" containerName="kserve-container" Apr 16 18:47:50.274310 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.274247 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="24dda508-a2ca-4338-bff6-5989f30bc584" containerName="kserve-container" Apr 16 18:47:50.277228 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.277206 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" Apr 16 18:47:50.286813 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.286795 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" Apr 16 18:47:50.293563 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.293535 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd"] Apr 16 18:47:50.423866 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.423837 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd"] Apr 16 18:47:50.425249 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:47:50.425216 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbab7994f_ff6b_424b_aeed_96a55fecf97a.slice/crio-9cab63454f050513bc5f9576a8862901a2c160e6ef4574efda5578f5c15e9f18 WatchSource:0}: Error finding container 9cab63454f050513bc5f9576a8862901a2c160e6ef4574efda5578f5c15e9f18: Status 404 returned error can't find the container with id 9cab63454f050513bc5f9576a8862901a2c160e6ef4574efda5578f5c15e9f18 Apr 16 18:47:50.427114 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.427092 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:47:50.996459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.996413 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" event={"ID":"bab7994f-ff6b-424b-aeed-96a55fecf97a","Type":"ContainerStarted","Data":"5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37"} Apr 16 18:47:50.996459 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.996458 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" event={"ID":"bab7994f-ff6b-424b-aeed-96a55fecf97a","Type":"ContainerStarted","Data":"9cab63454f050513bc5f9576a8862901a2c160e6ef4574efda5578f5c15e9f18"} Apr 16 18:47:50.996715 ip-10-0-138-88 kubenswrapper[2579]: I0416 
18:47:50.996582 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" Apr 16 18:47:50.998090 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:50.998058 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" podUID="bab7994f-ff6b-424b-aeed-96a55fecf97a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 18:47:51.012367 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:51.012320 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" podStartSLOduration=1.012307959 podStartE2EDuration="1.012307959s" podCreationTimestamp="2026-04-16 18:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:47:51.011268607 +0000 UTC m=+1813.918848635" watchObservedRunningTime="2026-04-16 18:47:51.012307959 +0000 UTC m=+1813.919887985" Apr 16 18:47:51.999489 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:51.999444 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" podUID="bab7994f-ff6b-424b-aeed-96a55fecf97a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 18:47:53.357624 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:53.357593 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" Apr 16 18:47:54.006362 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:54.006331 2579 generic.go:358] "Generic (PLEG): container finished" podID="d97424df-8707-43bc-8e27-47c356e78bc4" containerID="6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480" exitCode=0 Apr 16 18:47:54.006560 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:54.006415 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" Apr 16 18:47:54.006560 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:54.006422 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" event={"ID":"d97424df-8707-43bc-8e27-47c356e78bc4","Type":"ContainerDied","Data":"6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480"} Apr 16 18:47:54.006560 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:54.006465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9" event={"ID":"d97424df-8707-43bc-8e27-47c356e78bc4","Type":"ContainerDied","Data":"af876f9c057a35226bec28d7153fac99ad73815a5d1858626344e2a59294a3f7"} Apr 16 18:47:54.006560 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:54.006486 2579 scope.go:117] "RemoveContainer" containerID="6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480" Apr 16 18:47:54.014480 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:54.014456 2579 scope.go:117] "RemoveContainer" containerID="6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480" Apr 16 18:47:54.014767 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:47:54.014740 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480\": container with ID starting with 6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480 not found: ID does not exist" containerID="6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480" Apr 16 18:47:54.014819 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:54.014770 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480"} err="failed to get container status \"6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480\": rpc error: code = NotFound desc = could not find container \"6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480\": container with ID starting with 6c4350df39cac686a51417b0c8be28e09da3cc95762cfbff0e51cb9a0529e480 not found: ID does not exist" Apr 16 18:47:54.024485 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:54.024456 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9"] Apr 16 18:47:54.031101 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:54.031076 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-976d9-predictor-85bdbdc997-7jqm9"] Apr 16 18:47:55.668736 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:47:55.668703 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d97424df-8707-43bc-8e27-47c356e78bc4" path="/var/lib/kubelet/pods/d97424df-8707-43bc-8e27-47c356e78bc4/volumes" Apr 16 18:48:02.000103 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:02.000061 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" podUID="bab7994f-ff6b-424b-aeed-96a55fecf97a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 18:48:12.000092 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:12.000037 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" podUID="bab7994f-ff6b-424b-aeed-96a55fecf97a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 18:48:22.000229 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:22.000177 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" podUID="bab7994f-ff6b-424b-aeed-96a55fecf97a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 18:48:31.999888 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:31.999837 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" podUID="bab7994f-ff6b-424b-aeed-96a55fecf97a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 16 18:48:39.773575 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:39.773532 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc"] Apr 16 18:48:39.773952 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:39.773935 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d97424df-8707-43bc-8e27-47c356e78bc4" containerName="kserve-container" Apr 16 18:48:39.773996 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:39.773963 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97424df-8707-43bc-8e27-47c356e78bc4" containerName="kserve-container" Apr 16 18:48:39.774064 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:39.774051 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d97424df-8707-43bc-8e27-47c356e78bc4" containerName="kserve-container" Apr 16 18:48:39.777119 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:39.777098 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66"] Apr 16 18:48:39.777341 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:39.777318 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" Apr 16 18:48:39.777430 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:39.777324 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" podUID="0e13a8de-484c-49a5-afbc-663232b15c46" containerName="kserve-container" containerID="cri-o://6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe" gracePeriod=30 Apr 16 18:48:39.783464 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:39.783434 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc"] Apr 16 18:48:39.787873 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:39.787848 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" Apr 16 18:48:39.918838 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:39.918746 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc"] Apr 16 18:48:39.921538 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:48:39.921497 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5db8cb60_43e1_4046_b634_86ce0f17c5d5.slice/crio-ce5f9338db2eaef86b11b845381f9b01de6b7c5e0087223460fc2bc0d6754ddd WatchSource:0}: Error finding container ce5f9338db2eaef86b11b845381f9b01de6b7c5e0087223460fc2bc0d6754ddd: Status 404 returned error can't find the container with id ce5f9338db2eaef86b11b845381f9b01de6b7c5e0087223460fc2bc0d6754ddd Apr 16 18:48:40.141901 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:40.141812 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" event={"ID":"5db8cb60-43e1-4046-b634-86ce0f17c5d5","Type":"ContainerStarted","Data":"164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42"} Apr 16 18:48:40.141901 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:40.141848 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" event={"ID":"5db8cb60-43e1-4046-b634-86ce0f17c5d5","Type":"ContainerStarted","Data":"ce5f9338db2eaef86b11b845381f9b01de6b7c5e0087223460fc2bc0d6754ddd"} Apr 16 18:48:40.142132 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:40.142045 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" Apr 16 18:48:40.143534 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:40.143504 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" podUID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 18:48:40.160605 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:40.160551 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" podStartSLOduration=1.160535526 podStartE2EDuration="1.160535526s" podCreationTimestamp="2026-04-16 18:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:48:40.159200349 +0000 UTC m=+1863.066780378" watchObservedRunningTime="2026-04-16 18:48:40.160535526 +0000 UTC m=+1863.068115558" Apr 16 18:48:41.145104 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:41.145062 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" podUID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 18:48:42.001581 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:42.001541 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" Apr 16 18:48:43.021490 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:43.021463 2579 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" Apr 16 18:48:43.151484 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:43.151407 2579 generic.go:358] "Generic (PLEG): container finished" podID="0e13a8de-484c-49a5-afbc-663232b15c46" containerID="6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe" exitCode=0 Apr 16 18:48:43.151623 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:43.151479 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" Apr 16 18:48:43.151623 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:43.151483 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" event={"ID":"0e13a8de-484c-49a5-afbc-663232b15c46","Type":"ContainerDied","Data":"6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe"} Apr 16 18:48:43.151623 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:43.151521 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66" event={"ID":"0e13a8de-484c-49a5-afbc-663232b15c46","Type":"ContainerDied","Data":"a18406b26db658739b9fbeee68e6b20d2832b94f806613d1f70e14af40d7ab96"} Apr 16 18:48:43.151623 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:43.151536 2579 scope.go:117] "RemoveContainer" containerID="6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe" Apr 16 18:48:43.159320 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:43.159298 2579 scope.go:117] "RemoveContainer" containerID="6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe" Apr 16 18:48:43.159623 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:48:43.159601 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe\": container with ID starting with 6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe not found: ID does not exist" containerID="6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe" Apr 16 18:48:43.159702 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:43.159636 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe"} err="failed to get container status \"6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe\": rpc error: code = NotFound desc = could not find container \"6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe\": container with ID starting with 6df643747397869e456d130dc9266c7c27b47b9918cdd6cb4e2b71378ff96cbe not found: ID does not exist" Apr 16 18:48:43.172612 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:43.172584 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66"] Apr 16 18:48:43.176865 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:43.176832 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5aeb1-predictor-6985d9467-7ss66"] Apr 16 18:48:43.668929 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:43.668896 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e13a8de-484c-49a5-afbc-663232b15c46" path="/var/lib/kubelet/pods/0e13a8de-484c-49a5-afbc-663232b15c46/volumes" Apr 16 18:48:51.145866 
ip-10-0-138-88 kubenswrapper[2579]: I0416 18:48:51.145816 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" podUID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 18:49:01.145155 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:01.145105 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" podUID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 18:49:10.531820 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:10.531781 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd"] Apr 16 18:49:10.532197 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:10.532018 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" podUID="bab7994f-ff6b-424b-aeed-96a55fecf97a" containerName="kserve-container" containerID="cri-o://5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37" gracePeriod=30 Apr 16 18:49:10.534618 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:10.534593 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g"] Apr 16 18:49:10.534900 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:10.534887 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e13a8de-484c-49a5-afbc-663232b15c46" containerName="kserve-container" Apr 16 18:49:10.534965 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:10.534902 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e13a8de-484c-49a5-afbc-663232b15c46" containerName="kserve-container" Apr 16 18:49:10.535008 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:10.534973 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e13a8de-484c-49a5-afbc-663232b15c46" containerName="kserve-container" Apr 16 18:49:10.538050 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:10.538031 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" Apr 16 18:49:10.546544 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:10.546521 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g"] Apr 16 18:49:10.548562 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:10.548541 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" Apr 16 18:49:10.685368 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:10.685341 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g"] Apr 16 18:49:10.688269 ip-10-0-138-88 kubenswrapper[2579]: W0416 18:49:10.688229 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d71340_9fb6_4558_8eb1_3bf5fcadbba7.slice/crio-71ab7249fce7813a3e007001d68c3c7528821842f7fb4fc43251dfa9d858a923 WatchSource:0}: Error finding container 71ab7249fce7813a3e007001d68c3c7528821842f7fb4fc43251dfa9d858a923: Status 404 returned error can't find the container with id 71ab7249fce7813a3e007001d68c3c7528821842f7fb4fc43251dfa9d858a923 Apr 16 18:49:11.145530 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:11.145434 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" podUID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 18:49:11.237975 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:11.237934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" event={"ID":"78d71340-9fb6-4558-8eb1-3bf5fcadbba7","Type":"ContainerStarted","Data":"aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b"} Apr 16 18:49:11.237975 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:11.237976 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" event={"ID":"78d71340-9fb6-4558-8eb1-3bf5fcadbba7","Type":"ContainerStarted","Data":"71ab7249fce7813a3e007001d68c3c7528821842f7fb4fc43251dfa9d858a923"} Apr 16 18:49:11.238181 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:11.238090 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" Apr 16 18:49:11.239285 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:11.239261 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" podUID="78d71340-9fb6-4558-8eb1-3bf5fcadbba7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 18:49:11.255130 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:11.255083 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" podStartSLOduration=1.2550675519999999 podStartE2EDuration="1.255067552s" podCreationTimestamp="2026-04-16 18:49:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:49:11.253389741 +0000 UTC m=+1894.160969782" watchObservedRunningTime="2026-04-16 18:49:11.255067552 +0000 UTC m=+1894.162647658" Apr 16 18:49:12.000139 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:12.000087 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" podUID="bab7994f-ff6b-424b-aeed-96a55fecf97a" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.31:8080: connect: connection refused" Apr 16 18:49:12.241470 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:12.241435 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" podUID="78d71340-9fb6-4558-8eb1-3bf5fcadbba7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 18:49:13.674502 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:13.674481 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" Apr 16 18:49:14.251085 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:14.251046 2579 generic.go:358] "Generic (PLEG): container finished" podID="bab7994f-ff6b-424b-aeed-96a55fecf97a" containerID="5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37" exitCode=0 Apr 16 18:49:14.251259 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:14.251140 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" Apr 16 18:49:14.251259 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:14.251132 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" event={"ID":"bab7994f-ff6b-424b-aeed-96a55fecf97a","Type":"ContainerDied","Data":"5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37"} Apr 16 18:49:14.251259 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:14.251243 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd" event={"ID":"bab7994f-ff6b-424b-aeed-96a55fecf97a","Type":"ContainerDied","Data":"9cab63454f050513bc5f9576a8862901a2c160e6ef4574efda5578f5c15e9f18"} Apr 16 18:49:14.251259 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:14.251259 2579 scope.go:117] "RemoveContainer" containerID="5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37" Apr 16 18:49:14.260042 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:14.260015 2579 scope.go:117] "RemoveContainer" containerID="5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37" Apr 16 18:49:14.260301 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:49:14.260279 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37\": container with ID starting with 5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37 not found: ID does not exist" containerID="5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37" Apr 16 18:49:14.260381 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:14.260314 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37"} err="failed to get container status \"5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37\": rpc error: code = NotFound desc = could not find container \"5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37\": container with ID starting with 5e34a17797cbc6d6e791d0cabbd29cad2875b8f62bfc7d6fca77d04ac6e95b37 not found: ID does not exist" Apr 16 18:49:14.273632 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:14.273608 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd"] Apr 16 18:49:14.277706 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:14.277677 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ebde8-predictor-67b74b47f-mdvsd"] Apr 16 18:49:15.669513 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:15.669483 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bab7994f-ff6b-424b-aeed-96a55fecf97a" path="/var/lib/kubelet/pods/bab7994f-ff6b-424b-aeed-96a55fecf97a/volumes" Apr 16 18:49:21.146125 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:21.146079 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" podUID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 18:49:22.242460 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:22.242418 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" podUID="78d71340-9fb6-4558-8eb1-3bf5fcadbba7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 18:49:31.146342 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:31.146303 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" Apr 16 18:49:32.242154 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:32.242113 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" podUID="78d71340-9fb6-4558-8eb1-3bf5fcadbba7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 18:49:42.242293 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:42.242246 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" podUID="78d71340-9fb6-4558-8eb1-3bf5fcadbba7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 18:49:52.242056 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:49:52.242001 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" podUID="78d71340-9fb6-4558-8eb1-3bf5fcadbba7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 16 18:50:02.243366 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:50:02.243330 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" Apr 16 18:52:37.679931 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:52:37.679904 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:52:37.685049 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:52:37.685015 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:57:37.700969 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:57:37.700857 2579 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:57:37.706146 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:57:37.706123 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 18:58:35.418990 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:35.418912 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g"] Apr 16 18:58:35.419489 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:35.419172 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" podUID="78d71340-9fb6-4558-8eb1-3bf5fcadbba7" containerName="kserve-container" containerID="cri-o://aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b" gracePeriod=30 Apr 16 18:58:38.554258 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:38.554231 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" Apr 16 18:58:38.898890 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:38.898853 2579 generic.go:358] "Generic (PLEG): container finished" podID="78d71340-9fb6-4558-8eb1-3bf5fcadbba7" containerID="aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b" exitCode=0 Apr 16 18:58:38.898890 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:38.898893 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" event={"ID":"78d71340-9fb6-4558-8eb1-3bf5fcadbba7","Type":"ContainerDied","Data":"aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b"} Apr 16 18:58:38.899105 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:38.898910 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" Apr 16 18:58:38.899105 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:38.898925 2579 scope.go:117] "RemoveContainer" containerID="aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b" Apr 16 18:58:38.899105 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:38.898916 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g" event={"ID":"78d71340-9fb6-4558-8eb1-3bf5fcadbba7","Type":"ContainerDied","Data":"71ab7249fce7813a3e007001d68c3c7528821842f7fb4fc43251dfa9d858a923"} Apr 16 18:58:38.906658 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:38.906636 2579 scope.go:117] "RemoveContainer" containerID="aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b" Apr 16 18:58:38.906951 ip-10-0-138-88 kubenswrapper[2579]: E0416 18:58:38.906928 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b\": container with ID starting with aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b not found: ID does not exist" containerID="aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b" Apr 16 18:58:38.907030 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:38.906958 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b"} err="failed to get container status \"aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b\": rpc error: code = NotFound desc = could not find container \"aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b\": container with ID starting with aee1c91753517d139145c1cae364373805d05f22aa0fef8c588b91292b97544b not found: ID does not exist" Apr 16 18:58:38.920219 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:38.920192 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g"] Apr 16 18:58:38.924144 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:38.924123 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-56ec1-predictor-785d9fcb47-mml5g"] Apr 16 18:58:39.669531 ip-10-0-138-88 kubenswrapper[2579]: I0416 18:58:39.669498 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d71340-9fb6-4558-8eb1-3bf5fcadbba7" path="/var/lib/kubelet/pods/78d71340-9fb6-4558-8eb1-3bf5fcadbba7/volumes" Apr 16 19:02:37.722085 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:02:37.721983 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 19:02:37.730094 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:02:37.730067 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log" Apr 16 19:06:09.491819 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:09.491780 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc"] Apr 16 19:06:09.492387 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:09.492092 2579 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" podUID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" containerName="kserve-container" containerID="cri-o://164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42" gracePeriod=30 Apr 16 19:06:10.487857 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.487823 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-t8hlc/must-gather-gr242"] Apr 16 19:06:10.488165 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.488152 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bab7994f-ff6b-424b-aeed-96a55fecf97a" containerName="kserve-container" Apr 16 19:06:10.488165 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.488166 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab7994f-ff6b-424b-aeed-96a55fecf97a" containerName="kserve-container" Apr 16 19:06:10.488259 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.488178 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78d71340-9fb6-4558-8eb1-3bf5fcadbba7" containerName="kserve-container" Apr 16 19:06:10.488259 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.488184 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d71340-9fb6-4558-8eb1-3bf5fcadbba7" containerName="kserve-container" Apr 16 19:06:10.488259 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.488230 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="bab7994f-ff6b-424b-aeed-96a55fecf97a" containerName="kserve-container" Apr 16 19:06:10.488259 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.488240 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="78d71340-9fb6-4558-8eb1-3bf5fcadbba7" containerName="kserve-container" Apr 16 19:06:10.491321 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.491301 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-t8hlc/must-gather-gr242" Apr 16 19:06:10.493642 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.493620 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t8hlc\"/\"kube-root-ca.crt\"" Apr 16 19:06:10.493945 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.493703 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-t8hlc\"/\"openshift-service-ca.crt\"" Apr 16 19:06:10.494474 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.494453 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-t8hlc\"/\"default-dockercfg-5ckjl\"" Apr 16 19:06:10.512128 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.512096 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t8hlc/must-gather-gr242"] Apr 16 19:06:10.604564 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.604503 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eb473e7-8229-48bf-85e2-69b2c00ac1c3-must-gather-output\") pod \"must-gather-gr242\" (UID: \"2eb473e7-8229-48bf-85e2-69b2c00ac1c3\") " pod="openshift-must-gather-t8hlc/must-gather-gr242" Apr 16 19:06:10.604757 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.604590 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm2xn\" (UniqueName: \"kubernetes.io/projected/2eb473e7-8229-48bf-85e2-69b2c00ac1c3-kube-api-access-lm2xn\") pod \"must-gather-gr242\" (UID: \"2eb473e7-8229-48bf-85e2-69b2c00ac1c3\") " pod="openshift-must-gather-t8hlc/must-gather-gr242" Apr 16 19:06:10.705185 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.705155 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm2xn\" (UniqueName: \"kubernetes.io/projected/2eb473e7-8229-48bf-85e2-69b2c00ac1c3-kube-api-access-lm2xn\") pod \"must-gather-gr242\" (UID: \"2eb473e7-8229-48bf-85e2-69b2c00ac1c3\") " pod="openshift-must-gather-t8hlc/must-gather-gr242" Apr 16 19:06:10.705380 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.705217 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eb473e7-8229-48bf-85e2-69b2c00ac1c3-must-gather-output\") pod \"must-gather-gr242\" (UID: \"2eb473e7-8229-48bf-85e2-69b2c00ac1c3\") " pod="openshift-must-gather-t8hlc/must-gather-gr242" Apr 16 19:06:10.705560 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.705541 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eb473e7-8229-48bf-85e2-69b2c00ac1c3-must-gather-output\") pod \"must-gather-gr242\" (UID: \"2eb473e7-8229-48bf-85e2-69b2c00ac1c3\") " pod="openshift-must-gather-t8hlc/must-gather-gr242" Apr 16 19:06:10.715136 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.715101 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm2xn\" (UniqueName: \"kubernetes.io/projected/2eb473e7-8229-48bf-85e2-69b2c00ac1c3-kube-api-access-lm2xn\") pod \"must-gather-gr242\" (UID: \"2eb473e7-8229-48bf-85e2-69b2c00ac1c3\") " pod="openshift-must-gather-t8hlc/must-gather-gr242" Apr 16 19:06:10.810080 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.810047 2579 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-t8hlc/must-gather-gr242" Apr 16 19:06:10.931025 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.930995 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-t8hlc/must-gather-gr242"] Apr 16 19:06:10.934337 ip-10-0-138-88 kubenswrapper[2579]: W0416 19:06:10.934310 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eb473e7_8229_48bf_85e2_69b2c00ac1c3.slice/crio-e812020ac3e6410bbddd155a2227378117253baa147f18bfaf7060057eb1e85d WatchSource:0}: Error finding container e812020ac3e6410bbddd155a2227378117253baa147f18bfaf7060057eb1e85d: Status 404 returned error can't find the container with id e812020ac3e6410bbddd155a2227378117253baa147f18bfaf7060057eb1e85d Apr 16 19:06:10.936288 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:10.936261 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:06:11.145281 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:11.145188 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" podUID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 19:06:11.231320 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:11.231281 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8hlc/must-gather-gr242" event={"ID":"2eb473e7-8229-48bf-85e2-69b2c00ac1c3","Type":"ContainerStarted","Data":"e812020ac3e6410bbddd155a2227378117253baa147f18bfaf7060057eb1e85d"} Apr 16 19:06:12.961611 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:12.961549 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" Apr 16 19:06:13.239864 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:13.239828 2579 generic.go:358] "Generic (PLEG): container finished" podID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" containerID="164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42" exitCode=0 Apr 16 19:06:13.240036 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:13.239904 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" Apr 16 19:06:13.240036 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:13.239916 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" event={"ID":"5db8cb60-43e1-4046-b634-86ce0f17c5d5","Type":"ContainerDied","Data":"164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42"} Apr 16 19:06:13.240036 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:13.239957 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc" event={"ID":"5db8cb60-43e1-4046-b634-86ce0f17c5d5","Type":"ContainerDied","Data":"ce5f9338db2eaef86b11b845381f9b01de6b7c5e0087223460fc2bc0d6754ddd"} Apr 16 19:06:13.240036 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:13.239975 2579 scope.go:117] "RemoveContainer" containerID="164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42" Apr 16 19:06:13.265670 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:13.265638 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc"] Apr 16 19:06:13.269408 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:13.269368 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-979d7-predictor-7fc6dfb459-cs6wc"] Apr 16 19:06:13.669521 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:13.669442 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" path="/var/lib/kubelet/pods/5db8cb60-43e1-4046-b634-86ce0f17c5d5/volumes" Apr 16 19:06:15.065960 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:15.065747 2579 scope.go:117] "RemoveContainer" containerID="164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42" Apr 16 19:06:15.066219 ip-10-0-138-88 kubenswrapper[2579]: E0416 19:06:15.066094 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42\": container with ID starting with 164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42 not found: ID does not exist" containerID="164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42" Apr 16 19:06:15.066219 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:15.066140 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42"} err="failed to get container status \"164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42\": rpc error: code = NotFound desc = could not find container \"164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42\": container with ID starting with 164924cefc962480b5ba046a9efa3c8eb8cd7ce4742ad60f426b9e0eb1880b42 not found: ID does not exist" Apr 16 19:06:16.252458 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:16.252422 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8hlc/must-gather-gr242" event={"ID":"2eb473e7-8229-48bf-85e2-69b2c00ac1c3","Type":"ContainerStarted","Data":"06b50589a4ab3a0e6d93ad9e617ad171723520d21e9e9ce9a6cd72f2618029c8"} Apr 16 19:06:16.252868 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:16.252465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8hlc/must-gather-gr242" 
event={"ID":"2eb473e7-8229-48bf-85e2-69b2c00ac1c3","Type":"ContainerStarted","Data":"4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba"} Apr 16 19:06:16.272149 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:16.272094 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-t8hlc/must-gather-gr242" podStartSLOduration=1.868356533 podStartE2EDuration="6.272078994s" podCreationTimestamp="2026-04-16 19:06:10 +0000 UTC" firstStartedPulling="2026-04-16 19:06:10.93638358 +0000 UTC m=+2913.843963586" lastFinishedPulling="2026-04-16 19:06:15.340106041 +0000 UTC m=+2918.247686047" observedRunningTime="2026-04-16 19:06:16.269996605 +0000 UTC m=+2919.177576634" watchObservedRunningTime="2026-04-16 19:06:16.272078994 +0000 UTC m=+2919.179659056" Apr 16 19:06:34.310892 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:34.310854 2579 generic.go:358] "Generic (PLEG): container finished" podID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" containerID="4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba" exitCode=0 Apr 16 19:06:34.311300 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:34.310928 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-t8hlc/must-gather-gr242" event={"ID":"2eb473e7-8229-48bf-85e2-69b2c00ac1c3","Type":"ContainerDied","Data":"4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba"} Apr 16 19:06:34.311300 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:34.311280 2579 scope.go:117] "RemoveContainer" containerID="4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba" Apr 16 19:06:34.693136 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:34.693059 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t8hlc_must-gather-gr242_2eb473e7-8229-48bf-85e2-69b2c00ac1c3/gather/0.log" Apr 16 19:06:38.188090 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:38.188005 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-nl9sg_d0db8fbf-eebe-4eb1-84d2-ef97f04477fa/global-pull-secret-syncer/0.log" Apr 16 19:06:38.336771 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:38.336733 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-z5qzj_d5e8b6a3-f88f-40cd-be49-7c8a4efe8164/konnectivity-agent/0.log" Apr 16 19:06:38.422664 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:38.422635 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-88.ec2.internal_b69b580532cc9a5ba4501f776b0e392e/haproxy/0.log" Apr 16 19:06:40.098915 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.098875 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-t8hlc/must-gather-gr242"] Apr 16 19:06:40.099513 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.099178 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-t8hlc/must-gather-gr242" podUID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" containerName="copy" containerID="cri-o://06b50589a4ab3a0e6d93ad9e617ad171723520d21e9e9ce9a6cd72f2618029c8" gracePeriod=2 Apr 16 19:06:40.105639 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.105608 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-t8hlc/must-gather-gr242"] Apr 16 19:06:40.320373 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.320347 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-t8hlc_must-gather-gr242_2eb473e7-8229-48bf-85e2-69b2c00ac1c3/copy/0.log" Apr 16 19:06:40.320792 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.320773 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t8hlc/must-gather-gr242" Apr 16 19:06:40.322663 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.322639 2579 status_manager.go:895] "Failed to get status for pod" podUID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" pod="openshift-must-gather-t8hlc/must-gather-gr242" err="pods \"must-gather-gr242\" is forbidden: User \"system:node:ip-10-0-138-88.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-t8hlc\": no relationship found between node 'ip-10-0-138-88.ec2.internal' and this object" Apr 16 19:06:40.329091 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.329076 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-t8hlc_must-gather-gr242_2eb473e7-8229-48bf-85e2-69b2c00ac1c3/copy/0.log" Apr 16 19:06:40.329372 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.329353 2579 generic.go:358] "Generic (PLEG): container finished" podID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" containerID="06b50589a4ab3a0e6d93ad9e617ad171723520d21e9e9ce9a6cd72f2618029c8" exitCode=143 Apr 16 19:06:40.329466 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.329413 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-t8hlc/must-gather-gr242" Apr 16 19:06:40.329466 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.329448 2579 scope.go:117] "RemoveContainer" containerID="06b50589a4ab3a0e6d93ad9e617ad171723520d21e9e9ce9a6cd72f2618029c8" Apr 16 19:06:40.331290 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.331272 2579 status_manager.go:895] "Failed to get status for pod" podUID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" pod="openshift-must-gather-t8hlc/must-gather-gr242" err="pods \"must-gather-gr242\" is forbidden: User \"system:node:ip-10-0-138-88.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-t8hlc\": no relationship found between node 'ip-10-0-138-88.ec2.internal' and this object" Apr 16 19:06:40.336319 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.336303 2579 scope.go:117] "RemoveContainer" containerID="4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba" Apr 16 19:06:40.348450 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.348417 2579 scope.go:117] "RemoveContainer" containerID="06b50589a4ab3a0e6d93ad9e617ad171723520d21e9e9ce9a6cd72f2618029c8" Apr 16 19:06:40.348727 ip-10-0-138-88 kubenswrapper[2579]: E0416 19:06:40.348705 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06b50589a4ab3a0e6d93ad9e617ad171723520d21e9e9ce9a6cd72f2618029c8\": container with ID starting with 06b50589a4ab3a0e6d93ad9e617ad171723520d21e9e9ce9a6cd72f2618029c8 not found: ID does not exist" containerID="06b50589a4ab3a0e6d93ad9e617ad171723520d21e9e9ce9a6cd72f2618029c8" Apr 16 19:06:40.348790 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.348738 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06b50589a4ab3a0e6d93ad9e617ad171723520d21e9e9ce9a6cd72f2618029c8"} err="failed to get container status \"06b50589a4ab3a0e6d93ad9e617ad171723520d21e9e9ce9a6cd72f2618029c8\": rpc error: code = NotFound desc = could 
not find container \"06b50589a4ab3a0e6d93ad9e617ad171723520d21e9e9ce9a6cd72f2618029c8\": container with ID starting with 06b50589a4ab3a0e6d93ad9e617ad171723520d21e9e9ce9a6cd72f2618029c8 not found: ID does not exist" Apr 16 19:06:40.348790 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.348756 2579 scope.go:117] "RemoveContainer" containerID="4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba" Apr 16 19:06:40.349019 ip-10-0-138-88 kubenswrapper[2579]: E0416 19:06:40.348978 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba\": container with ID starting with 4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba not found: ID does not exist" containerID="4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba" Apr 16 19:06:40.349019 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.349005 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba"} err="failed to get container status \"4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba\": rpc error: code = NotFound desc = could not find container \"4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba\": container with ID starting with 4e9783a759d32b24d2e344e10298d9a9c8f2bf527b7de7bf9cf0a224e21d0dba not found: ID does not exist" Apr 16 19:06:40.363537 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.363516 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm2xn\" (UniqueName: \"kubernetes.io/projected/2eb473e7-8229-48bf-85e2-69b2c00ac1c3-kube-api-access-lm2xn\") pod \"2eb473e7-8229-48bf-85e2-69b2c00ac1c3\" (UID: \"2eb473e7-8229-48bf-85e2-69b2c00ac1c3\") " Apr 16 19:06:40.363647 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.363595 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eb473e7-8229-48bf-85e2-69b2c00ac1c3-must-gather-output\") pod \"2eb473e7-8229-48bf-85e2-69b2c00ac1c3\" (UID: \"2eb473e7-8229-48bf-85e2-69b2c00ac1c3\") " Apr 16 19:06:40.364917 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.364893 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eb473e7-8229-48bf-85e2-69b2c00ac1c3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2eb473e7-8229-48bf-85e2-69b2c00ac1c3" (UID: "2eb473e7-8229-48bf-85e2-69b2c00ac1c3"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:06:40.365608 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.365585 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eb473e7-8229-48bf-85e2-69b2c00ac1c3-kube-api-access-lm2xn" (OuterVolumeSpecName: "kube-api-access-lm2xn") pod "2eb473e7-8229-48bf-85e2-69b2c00ac1c3" (UID: "2eb473e7-8229-48bf-85e2-69b2c00ac1c3"). InnerVolumeSpecName "kube-api-access-lm2xn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:06:40.464187 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.464136 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lm2xn\" (UniqueName: \"kubernetes.io/projected/2eb473e7-8229-48bf-85e2-69b2c00ac1c3-kube-api-access-lm2xn\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 19:06:40.464187 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.464178 2579 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2eb473e7-8229-48bf-85e2-69b2c00ac1c3-must-gather-output\") on node \"ip-10-0-138-88.ec2.internal\" DevicePath \"\"" Apr 16 19:06:40.639788 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:40.639760 2579 status_manager.go:895] "Failed to get status for pod" podUID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" pod="openshift-must-gather-t8hlc/must-gather-gr242" err="pods \"must-gather-gr242\" is forbidden: User \"system:node:ip-10-0-138-88.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-t8hlc\": no relationship found between node 'ip-10-0-138-88.ec2.internal' and this object" Apr 16 19:06:41.668501 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:41.668460 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" path="/var/lib/kubelet/pods/2eb473e7-8229-48bf-85e2-69b2c00ac1c3/volumes" Apr 16 19:06:41.908412 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:41.908376 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-rhbqn_69f60c16-14f7-41f5-af8f-f30635d5ef32/kube-state-metrics/0.log" Apr 16 19:06:41.936255 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:41.936161 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-rhbqn_69f60c16-14f7-41f5-af8f-f30635d5ef32/kube-rbac-proxy-main/0.log" Apr 16 19:06:41.962472 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:41.962446 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-rhbqn_69f60c16-14f7-41f5-af8f-f30635d5ef32/kube-rbac-proxy-self/0.log" Apr 16 19:06:41.993429 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:41.993378 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-886754884-tkmnd_929b560f-9d1d-4d0b-be84-3095c605bb4c/metrics-server/0.log" Apr 16 19:06:42.226183 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:42.226101 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8vj7t_f0ad478e-fb4e-41fc-8942-d15c685f82b4/node-exporter/0.log" Apr 16 19:06:42.251809 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:42.251776 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8vj7t_f0ad478e-fb4e-41fc-8942-d15c685f82b4/kube-rbac-proxy/0.log" Apr 16 19:06:42.276764 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:42.276737 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8vj7t_f0ad478e-fb4e-41fc-8942-d15c685f82b4/init-textfile/0.log" Apr 16 19:06:42.415510 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:42.415472 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce/prometheus/0.log" Apr 16 19:06:42.434640 ip-10-0-138-88 
kubenswrapper[2579]: I0416 19:06:42.434612 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce/config-reloader/0.log"
Apr 16 19:06:42.461232 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:42.461204 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce/thanos-sidecar/0.log"
Apr 16 19:06:42.488270 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:42.488241 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce/kube-rbac-proxy-web/0.log"
Apr 16 19:06:42.516098 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:42.516072 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce/kube-rbac-proxy/0.log"
Apr 16 19:06:42.541265 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:42.541230 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce/kube-rbac-proxy-thanos/0.log"
Apr 16 19:06:42.570043 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:42.570010 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_77eb8613-8af8-4e1e-89ff-97a2d8a2b3ce/init-config-reloader/0.log"
Apr 16 19:06:42.604062 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:42.604035 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-8s6wd_e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82/prometheus-operator/0.log"
Apr 16 19:06:42.626507 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:42.626457 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-8s6wd_e55f90a5-c3c9-46e3-9ca2-8ce6b1413f82/kube-rbac-proxy/0.log"
Apr 16 19:06:44.443992 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.443966 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/2.log"
Apr 16 19:06:44.450788 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.450756 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-hjml7_f3230a09-30be-4152-ac36-65d7911245a2/console-operator/3.log"
Apr 16 19:06:44.661065 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.661030 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"]
Apr 16 19:06:44.661324 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.661310 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" containerName="gather"
Apr 16 19:06:44.661371 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.661325 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" containerName="gather"
Apr 16 19:06:44.661371 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.661339 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" containerName="kserve-container"
Apr 16 19:06:44.661371 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.661344 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" containerName="kserve-container"
Apr 16 19:06:44.661371 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.661356 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" containerName="copy"
Apr 16 19:06:44.661371 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.661362 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" containerName="copy"
Apr 16 19:06:44.661531 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.661425 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5db8cb60-43e1-4046-b634-86ce0f17c5d5" containerName="kserve-container"
Apr 16 19:06:44.661531 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.661435 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" containerName="copy"
Apr 16 19:06:44.661531 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.661444 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2eb473e7-8229-48bf-85e2-69b2c00ac1c3" containerName="gather"
Apr 16 19:06:44.666593 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.666569 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.668974 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.668948 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6xqvt\"/\"openshift-service-ca.crt\""
Apr 16 19:06:44.668974 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.668962 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6xqvt\"/\"kube-root-ca.crt\""
Apr 16 19:06:44.669540 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.669516 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6xqvt\"/\"default-dockercfg-4xt97\""
Apr 16 19:06:44.674904 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.674879 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"]
Apr 16 19:06:44.700046 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.699956 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/98b45834-119b-430e-a9e1-8e5c57a48f17-proc\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.700046 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.700032 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfr7z\" (UniqueName: \"kubernetes.io/projected/98b45834-119b-430e-a9e1-8e5c57a48f17-kube-api-access-nfr7z\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.700230 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.700142 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98b45834-119b-430e-a9e1-8e5c57a48f17-lib-modules\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.700230 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.700171 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/98b45834-119b-430e-a9e1-8e5c57a48f17-podres\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.700230 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.700211 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98b45834-119b-430e-a9e1-8e5c57a48f17-sys\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.801617 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.801578 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98b45834-119b-430e-a9e1-8e5c57a48f17-lib-modules\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.801617 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.801612 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/98b45834-119b-430e-a9e1-8e5c57a48f17-podres\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.801819 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.801696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98b45834-119b-430e-a9e1-8e5c57a48f17-sys\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.801819 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.801727 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/98b45834-119b-430e-a9e1-8e5c57a48f17-podres\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.801819 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.801736 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/98b45834-119b-430e-a9e1-8e5c57a48f17-lib-modules\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.801819 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.801779 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98b45834-119b-430e-a9e1-8e5c57a48f17-sys\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.801819 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.801791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/98b45834-119b-430e-a9e1-8e5c57a48f17-proc\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.801968 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.801829 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfr7z\" (UniqueName: \"kubernetes.io/projected/98b45834-119b-430e-a9e1-8e5c57a48f17-kube-api-access-nfr7z\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.801968 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.801875 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/98b45834-119b-430e-a9e1-8e5c57a48f17-proc\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.816082 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.816055 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfr7z\" (UniqueName: \"kubernetes.io/projected/98b45834-119b-430e-a9e1-8e5c57a48f17-kube-api-access-nfr7z\") pod \"perf-node-gather-daemonset-2h45g\" (UID: \"98b45834-119b-430e-a9e1-8e5c57a48f17\") " pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:44.977760 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:44.977662 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:45.102419 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:45.102361 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"]
Apr 16 19:06:45.105754 ip-10-0-138-88 kubenswrapper[2579]: W0416 19:06:45.105724 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod98b45834_119b_430e_a9e1_8e5c57a48f17.slice/crio-80ccb89185da7442cec92089426f8346951b8d5345483bdc61c8ca88cef06f79 WatchSource:0}: Error finding container 80ccb89185da7442cec92089426f8346951b8d5345483bdc61c8ca88cef06f79: Status 404 returned error can't find the container with id 80ccb89185da7442cec92089426f8346951b8d5345483bdc61c8ca88cef06f79
Apr 16 19:06:45.339062 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:45.339029 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-rdhmq_13962cee-2aff-4652-a7e9-530d3125242a/volume-data-source-validator/0.log"
Apr 16 19:06:45.346816 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:45.346782 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g" event={"ID":"98b45834-119b-430e-a9e1-8e5c57a48f17","Type":"ContainerStarted","Data":"4f5ce5a8a5a5ac0058112bef135ad37bad73fbb86cc7a8f04f035241666ed5e7"}
Apr 16 19:06:45.346816 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:45.346815 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g" event={"ID":"98b45834-119b-430e-a9e1-8e5c57a48f17","Type":"ContainerStarted","Data":"80ccb89185da7442cec92089426f8346951b8d5345483bdc61c8ca88cef06f79"}
Apr 16 19:06:45.346997 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:45.346886 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:45.364319 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:45.364275 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g" podStartSLOduration=1.364260693 podStartE2EDuration="1.364260693s" podCreationTimestamp="2026-04-16 19:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:06:45.363567062 +0000 UTC m=+2948.271147092" watchObservedRunningTime="2026-04-16 19:06:45.364260693 +0000 UTC m=+2948.271840720"
Apr 16 19:06:46.148903 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:46.148875 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mh6jk_15e80e92-2f17-4ce6-a0cc-2073e197b9c2/dns/0.log"
Apr 16 19:06:46.175331 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:46.175282 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mh6jk_15e80e92-2f17-4ce6-a0cc-2073e197b9c2/kube-rbac-proxy/0.log"
Apr 16 19:06:46.253850 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:46.253816 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sfjlw_0c49a704-d49e-48e8-a3bf-2cbbf59da5a5/dns-node-resolver/0.log"
Apr 16 19:06:46.686537 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:46.686501 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-57dcf45987-66g2j_24dc120e-24c7-4a98-a3ca-c4e002937a7b/registry/0.log"
Apr 16 19:06:46.710859 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:46.710828 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-g2bjm_a3216a58-ad89-4814-b5b4-7ae5bf98510e/node-ca/0.log"
Apr 16 19:06:47.995429 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:47.995381 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-t9hjb_d06241a9-55bd-4260-9855-06114156f4d2/serve-healthcheck-canary/0.log"
Apr 16 19:06:48.432484 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:48.432388 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-6tvbf_f8f3ed9b-bbc2-445b-9380-064dc07640d5/insights-operator/0.log"
Apr 16 19:06:48.433647 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:48.433620 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-6tvbf_f8f3ed9b-bbc2-445b-9380-064dc07640d5/insights-operator/1.log"
Apr 16 19:06:48.627560 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:48.627528 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lss66_a4466154-2f02-4daf-a450-7d13e7879820/kube-rbac-proxy/0.log"
Apr 16 19:06:48.669438 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:48.669408 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lss66_a4466154-2f02-4daf-a450-7d13e7879820/exporter/0.log"
Apr 16 19:06:48.718603 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:48.718519 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lss66_a4466154-2f02-4daf-a450-7d13e7879820/extractor/0.log"
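The pod_startup_latency_tracker entry above packs the startup measurement into key="value" fields: podStartSLOduration is the gap between podCreationTimestamp and the observed running time (the zero-valued pulling timestamps mean no image pull contributed). A small sketch, assuming Python, that re-derives the reported 1.364 s from those fields; note klog prints nanosecond fractions, which strptime's %f cannot take, so they are trimmed to microseconds.

```python
import re
from datetime import datetime

# fields copied verbatim from the entry above
entry = ('podStartSLOduration=1.364260693 '
         'podCreationTimestamp="2026-04-16 19:06:44 +0000 UTC" '
         'watchObservedRunningTime="2026-04-16 19:06:45.364260693 +0000 UTC m=+2948.271840720"')

fields = dict(re.findall(r'(\w+)="([^"]*)"', entry))

def parse(ts):
    # keep "YYYY-MM-DD HH:MM:SS[.frac]" and trim nanoseconds to microseconds
    ts = ts.split(" +0000")[0]
    if "." in ts:
        head, frac = ts.rsplit(".", 1)
        return datetime.strptime(f"{head}.{frac[:6]}", "%Y-%m-%d %H:%M:%S.%f")
    return datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")

created = parse(fields["podCreationTimestamp"])
running = parse(fields["watchObservedRunningTime"])
slo = float(entry.split("podStartSLOduration=")[1].split()[0])
print(f"reported: {slo}s  recomputed: {(running - created).total_seconds()}s")
```

The recomputed value agrees to microsecond precision, since the creation timestamp carries no fractional seconds and the watch-observed running time is what the tracker reports.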
Apr 16 19:06:51.236352 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:51.236314 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-mwlxs_a25d125b-34f0-4d26-ac87-3da371639f2d/manager/0.log"
Apr 16 19:06:51.257863 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:51.257829 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-zs26v_c3cd2a35-0050-49f4-a252-c823aa4a04a8/s3-init/0.log"
Apr 16 19:06:51.359537 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:51.359509 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6xqvt/perf-node-gather-daemonset-2h45g"
Apr 16 19:06:56.780693 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:56.780614 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5gncp_5eb4032a-87eb-4b5c-955d-2f54c02cd4de/kube-multus/0.log"
Apr 16 19:06:56.852523 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:56.852492 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx5hq_9ff1abd4-4c73-4895-854f-6aa240273e76/kube-multus-additional-cni-plugins/0.log"
Apr 16 19:06:56.877331 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:56.877303 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx5hq_9ff1abd4-4c73-4895-854f-6aa240273e76/egress-router-binary-copy/0.log"
Apr 16 19:06:56.902226 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:56.902200 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx5hq_9ff1abd4-4c73-4895-854f-6aa240273e76/cni-plugins/0.log"
Apr 16 19:06:56.927321 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:56.927289 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx5hq_9ff1abd4-4c73-4895-854f-6aa240273e76/bond-cni-plugin/0.log"
Apr 16 19:06:56.953235 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:56.953206 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx5hq_9ff1abd4-4c73-4895-854f-6aa240273e76/routeoverride-cni/0.log"
Apr 16 19:06:56.977550 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:56.977521 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx5hq_9ff1abd4-4c73-4895-854f-6aa240273e76/whereabouts-cni-bincopy/0.log"
Apr 16 19:06:57.002644 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:57.002612 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kx5hq_9ff1abd4-4c73-4895-854f-6aa240273e76/whereabouts-cni/0.log"
Apr 16 19:06:57.486317 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:57.486281 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h2jgj_a84444e3-6cab-4290-a61c-c01132150e31/network-metrics-daemon/0.log"
Apr 16 19:06:57.515281 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:57.515254 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h2jgj_a84444e3-6cab-4290-a61c-c01132150e31/kube-rbac-proxy/0.log"
Apr 16 19:06:58.740747 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:58.740711 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkthf_e7b29382-768b-4aa5-a896-07f32fc4d4e6/ovn-controller/0.log"
Apr 16 19:06:58.788543 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:58.788497 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkthf_e7b29382-768b-4aa5-a896-07f32fc4d4e6/ovn-acl-logging/0.log"
Apr 16 19:06:58.813245 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:58.813212 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkthf_e7b29382-768b-4aa5-a896-07f32fc4d4e6/kube-rbac-proxy-node/0.log"
Apr 16 19:06:58.837808 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:58.837780 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkthf_e7b29382-768b-4aa5-a896-07f32fc4d4e6/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 19:06:58.858944 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:58.858913 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkthf_e7b29382-768b-4aa5-a896-07f32fc4d4e6/northd/0.log"
Apr 16 19:06:58.882310 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:58.882283 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkthf_e7b29382-768b-4aa5-a896-07f32fc4d4e6/nbdb/0.log"
Apr 16 19:06:58.905570 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:58.905540 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkthf_e7b29382-768b-4aa5-a896-07f32fc4d4e6/sbdb/0.log"
Apr 16 19:06:59.072666 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:06:59.072587 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zkthf_e7b29382-768b-4aa5-a896-07f32fc4d4e6/ovnkube-controller/0.log"
Apr 16 19:07:00.415890 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:07:00.415864 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-rlbq9_96b64fed-ad15-4e9c-b012-f7c3cac2fcb4/check-endpoints/0.log"
Apr 16 19:07:00.446743 ip-10-0-138-88 kubenswrapper[2579]: I0416 19:07:00.446711 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-8wxmz_afad1e88-aa25-44f7-8893-4eac3477f6c8/network-check-target-container/0.log"