Apr 24 21:24:08.123440 ip-10-0-133-48 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:24:08.123455 ip-10-0-133-48 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:24:08.123464 ip-10-0-133-48 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:24:08.123998 ip-10-0-133-48 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:24:18.183015 ip-10-0-133-48 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:24:18.183029 ip-10-0-133-48 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 1a59a0439bd84aca9b89f03b43d8322f --
Apr 24 21:26:32.527649 ip-10-0-133-48 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:26:32.948410 ip-10-0-133-48 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:32.948410 ip-10-0-133-48 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:26:32.948410 ip-10-0-133-48 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:32.948410 ip-10-0-133-48 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:26:32.948410 ip-10-0-133-48 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:32.950576 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.950485 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:26:32.957213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957185 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:32.957213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957205 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:32.957213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957209 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:32.957213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957214 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:32.957213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957219 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:32.957213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957222 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:32.957213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957226 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957230 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957233 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957235 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957238 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957242 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957244 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957247 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957250 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957253 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957256 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957258 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957261 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957264 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957266 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957269 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957272 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957274 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957278 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957280 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:32.957502 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957283 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957286 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957288 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957291 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957293 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957296 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957299 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957301 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957304 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957307 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957309 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957312 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957314 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957318 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957320 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957323 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957326 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957342 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957344 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:32.957984 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957347 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957350 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957353 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957356 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957359 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957362 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957364 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957367 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957372 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957376 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957379 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957382 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957384 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957387 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957390 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957393 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957396 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957398 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957401 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957404 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:32.958507 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957407 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957409 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957412 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957414 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957417 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957419 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957422 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957425 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957429 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957432 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957434 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957437 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957440 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957444 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957447 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957450 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957453 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957456 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957458 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957461 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:32.958989 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957463 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957895 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957901 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957905 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957908 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957911 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957913 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957916 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957919 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957921 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957924 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957927 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957930 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957932 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957935 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957937 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957940 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957943 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957947 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:32.959503 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957950 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957953 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957955 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957958 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957960 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957963 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957967 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957969 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957972 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957974 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957977 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957980 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957982 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957985 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957988 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957990 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957993 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957996 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.957998 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958001 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:32.959958 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958003 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958006 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958009 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958012 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958014 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958017 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958019 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958022 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958024 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958027 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958029 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958032 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958035 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958038 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958040 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958043 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958046 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958049 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958051 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:32.960477 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958054 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958057 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958059 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958061 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958064 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958067 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958069 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958072 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958074 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958078 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958080 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958083 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958085 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958088 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958091 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958093 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958095 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958098 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958100 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958104 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:32.960943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958107 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958112 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958115 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958118 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958121 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958124 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958127 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958129 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.958132 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959501 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959513 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959527 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959531 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959537 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959540 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959545 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959549 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959553 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959556 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959559 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959563 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959566 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:26:32.961450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959569 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959572 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959575 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959579 2574 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959581 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959584 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959589 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959591 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959595 2574 flags.go:64] FLAG: --config-dir=""
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959598 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959601 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959605 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959608 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959611 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959614 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959617 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959621 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959624 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959627 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959630 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959635 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959637 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959641 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959643 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959648 2574 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:26:32.961977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959651 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959656 2574 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959659 2574 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959662 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959666 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959668 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959672 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959675 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959679 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959682 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959686 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959689 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24
21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959692 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959695 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959698 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959701 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959704 2574 flags.go:64] FLAG: --feature-gates="" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959708 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959711 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959715 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959718 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959722 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959725 2574 flags.go:64] FLAG: --help="false" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959727 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-133-48.ec2.internal" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959731 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:26:32.962614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959734 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959737 2574 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959740 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959743 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959747 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959749 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959752 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959756 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959759 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959762 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959765 2574 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959768 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959770 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959773 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959776 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:26:32.963265 ip-10-0-133-48 
kubenswrapper[2574]: I0424 21:26:32.959779 2574 flags.go:64] FLAG: --lock-file="" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959781 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959784 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959787 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959793 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959796 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959799 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959802 2574 flags.go:64] FLAG: --logging-format="text" Apr 24 21:26:32.963265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959804 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959808 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959810 2574 flags.go:64] FLAG: --manifest-url="" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959813 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959818 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959821 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959826 2574 flags.go:64] FLAG: --max-pods="110" Apr 24 21:26:32.963856 
ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959829 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959832 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959835 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959838 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959842 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959845 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959848 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959856 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959859 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959862 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959866 2574 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959870 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959876 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959878 2574 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959882 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959884 2574 flags.go:64] FLAG: --port="10250" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959888 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:26:32.963856 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959891 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f2c64cd6381bb817" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959894 2574 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959897 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959900 2574 flags.go:64] FLAG: --register-node="true" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959903 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959906 2574 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959909 2574 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959912 2574 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959915 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959920 2574 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959925 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959928 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 
21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959931 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959934 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959937 2574 flags.go:64] FLAG: --runonce="false" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959940 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959943 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959946 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959949 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959952 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959955 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959958 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959961 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959964 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959967 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959970 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:26:32.964482 ip-10-0-133-48 kubenswrapper[2574]: I0424 
21:26:32.959974 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959978 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959981 2574 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959984 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959989 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959992 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959995 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.959999 2574 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.960001 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.960004 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.960007 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.960010 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.960013 2574 flags.go:64] FLAG: --v="2" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.960017 2574 flags.go:64] FLAG: --version="false" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.960022 2574 flags.go:64] FLAG: --vmodule="" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 
21:26:32.960027 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.960030 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960131 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960135 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960138 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960141 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960144 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960147 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:26:32.965118 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960150 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960153 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960156 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960158 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960161 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 
21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960163 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960166 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960168 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960171 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960173 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960177 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960179 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960182 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960184 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960187 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960189 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960192 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960194 2574 
feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960197 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:26:32.965685 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960200 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960203 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960205 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960208 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960210 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960214 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960217 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960219 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960222 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960227 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960230 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 
21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960233 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960236 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960238 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960241 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960244 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960246 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960249 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960252 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960254 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:26:32.966189 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960257 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960260 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960262 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960264 2574 feature_gate.go:328] unrecognized feature gate: 
DNSNameResolver Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960267 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960272 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960276 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960278 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960281 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960284 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960288 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960292 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960294 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960297 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960300 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960302 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960305 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960309 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960311 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:26:32.966845 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960314 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960316 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960320 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960323 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960326 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 
24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960341 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960343 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960346 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960349 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960352 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960354 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960357 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960360 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960362 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960365 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960368 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960370 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960373 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960376 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960379 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:32.967319 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960382 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.960384 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.961033 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.967667 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.967781 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967833 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967838 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967842 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967844 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967848 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967851 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967854 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967857 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967860 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:32.967857 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967863 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967867 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967870 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967873 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967876 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967878 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967881 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967884 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967887 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967889 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967892 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967895 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967897 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967900 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967902 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967905 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967908 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967911 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967914 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:32.968220 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967917 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967919 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967922 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967924 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967927 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967929 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967932 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967934 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967937 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967940 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967942 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967944 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967947 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967949 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967952 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967955 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967958 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967961 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967964 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967966 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:32.968718 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967969 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967972 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967974 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967977 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967979 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967982 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967984 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967987 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967991 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.967996 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968001 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968004 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968007 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968010 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968013 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968015 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968017 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968020 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968023 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:32.969213 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968025 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968028 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968030 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968033 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968035 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968038 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968040 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968043 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968046 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968049 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968052 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968054 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968057 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968060 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968063 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968065 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968068 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968071 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:32.969716 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968073 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.968079 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968197 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968202 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968206 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968208 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968212 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968214 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968217 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968220 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968223 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968226 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968229 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968232 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968235 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:32.970184 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968238 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968242 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968246 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968249 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968252 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968255 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968258 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968261 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968264 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968267 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968269 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968272 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968274 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968277 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968279 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968282 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968284 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968287 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968290 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:32.970582 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968292 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968295 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968297 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968300 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968302 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968306 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968309 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968312 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968315 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968317 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968320 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968322 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968325 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968343 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968346 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968349 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968352 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968354 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968356 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968359 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:32.971059 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968362 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968364 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968367 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968370 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968372 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968375 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968377 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968380 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968382 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968385 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968387 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968390 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968392 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968395 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968397 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968400 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968403 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968405 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968409 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968411 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:32.971579 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968414 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968416 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968418 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968421 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968423 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968426 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968428 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968430 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968433 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968435 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968437 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968440 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968442 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:32.968445 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.968449 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:32.972065 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.969239 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:26:32.973307 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.973292 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:26:32.974318 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.974307 2574 server.go:1019] "Starting client certificate rotation"
Apr 24 21:26:32.974380 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.974357 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:32.974413 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.974401 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:32.996934 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:32.996910 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:33.002653 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.002621 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:33.018926 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.018904 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:26:33.026284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.026264 2574 log.go:25] "Validated CRI v1 image API"
Apr 24 21:26:33.026766 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.026745 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:26:33.027771 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.027758 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:26:33.033669 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.033646 2574 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9c5be39e-f2b0-4cda-ab16-facb957dabe3:/dev/nvme0n1p3 dfa8c530-d05a-4005-9705-5ecf6894b5fc:/dev/nvme0n1p4]
Apr 24 21:26:33.033746 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.033668 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:26:33.039912 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.039795 2574 manager.go:217] Machine: {Timestamp:2026-04-24 21:26:33.038078047 +0000 UTC m=+0.384984384 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100126 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec224ea96aa2b900de622f7b855c8a74 SystemUUID:ec224ea9-6aa2-b900-de62-2f7b855c8a74 BootID:1a59a043-9bd8-4aca-9b89-f03b43d8322f Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ce:af:e2:2b:45 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ce:af:e2:2b:45 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3e:8b:3e:1c:e8:52 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:26:33.039912 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.039905 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:26:33.040056 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.040043 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:26:33.041179 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.041151 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:26:33.041323 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.041181 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-48.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:26:33.041383 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.041345 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:26:33.041383 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.041355 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:26:33.041383 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.041367 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:26:33.042086 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.042075 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:26:33.043680 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.043668 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:33.043815 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.043805 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:26:33.046132 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.046120 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:26:33.046178 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.046143 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:26:33.046178 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.046155 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:26:33.046178 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.046167 2574 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:26:33.046274 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.046179 2574 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 24 21:26:33.047246 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.047234 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:26:33.047291 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.047254 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:26:33.050519 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.050504 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:26:33.052689 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.052673 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:26:33.054164 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.054150 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:26:33.054234 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.054170 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:26:33.054234 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.054179 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:26:33.054234 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.054187 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:26:33.054234 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.054195 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:26:33.054234 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.054205 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:26:33.054234 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.054213 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 
21:26:33.054234 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.054223 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:26:33.054234 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.054235 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:26:33.054518 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.054244 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:26:33.054518 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.054264 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:26:33.054518 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.054279 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:26:33.055132 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.055120 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:26:33.055184 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.055136 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:26:33.059040 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.059018 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-48.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:26:33.059144 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.059044 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:26:33.059439 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.059408 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 
21:26:33.059543 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.059507 2574 server.go:1295] "Started kubelet" Apr 24 21:26:33.059599 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.059563 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-48.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:26:33.060042 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.060013 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:26:33.060521 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.060473 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:26:33.060591 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.060541 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:26:33.060574 ip-10-0-133-48 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:26:33.061999 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.061983 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:26:33.063120 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.063105 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:26:33.068069 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.067134 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-48.ec2.internal.18a9681afab637da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-48.ec2.internal,UID:ip-10-0-133-48.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-48.ec2.internal,},FirstTimestamp:2026-04-24 21:26:33.059055578 +0000 UTC m=+0.405961919,LastTimestamp:2026-04-24 21:26:33.059055578 +0000 UTC m=+0.405961919,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-48.ec2.internal,}" Apr 24 21:26:33.070167 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.070146 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:26:33.070167 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.070163 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:26:33.071160 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.070985 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:26:33.071160 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.071055 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 
21:26:33.071160 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.071072 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:26:33.071160 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.071115 2574 factory.go:55] Registering systemd factory Apr 24 21:26:33.071160 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.071130 2574 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:26:33.071160 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.071155 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:26:33.071160 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.071166 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:26:33.071563 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.071351 2574 factory.go:153] Registering CRI-O factory Apr 24 21:26:33.071563 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.071367 2574 factory.go:223] Registration of the crio container factory successfully Apr 24 21:26:33.071563 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.071457 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:26:33.071563 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.071483 2574 factory.go:103] Registering Raw factory Apr 24 21:26:33.071563 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.071500 2574 manager.go:1196] Started watching for new ooms in manager Apr 24 21:26:33.071563 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.071553 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:26:33.071837 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.071636 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:33.071937 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.071924 2574 manager.go:319] Starting recovery of all containers Apr 24 21:26:33.073284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.073258 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zbmdf" Apr 24 21:26:33.081131 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.081104 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zbmdf" Apr 24 21:26:33.081379 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.081323 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-48.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:26:33.081879 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.081836 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:26:33.082122 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.082108 2574 manager.go:324] Recovery completed Apr 24 21:26:33.086714 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.086701 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" 
Apr 24 21:26:33.090106 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.090091 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:33.090166 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.090120 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:33.090166 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.090131 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:33.090754 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.090736 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:26:33.090754 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.090751 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:26:33.090906 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.090769 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:33.093761 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.093747 2574 policy_none.go:49] "None policy: Start" Apr 24 21:26:33.093842 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.093765 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:26:33.093842 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.093778 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:26:33.131045 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.131023 2574 manager.go:341] "Starting Device Plugin manager" Apr 24 21:26:33.140223 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.131071 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:26:33.140223 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.131085 2574 server.go:85] "Starting device plugin registration server" Apr 24 
21:26:33.140223 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.131371 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:26:33.140223 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.131388 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:26:33.140223 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.131471 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:26:33.140223 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.131560 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:26:33.140223 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.131568 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:26:33.140223 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.132614 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:26:33.140223 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.132651 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:33.141150 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.141119 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:26:33.142253 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.142234 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:26:33.142319 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.142271 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:26:33.142319 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.142297 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:26:33.142319 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.142308 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:26:33.142447 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.142423 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:26:33.146620 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.146603 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:33.232466 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.232386 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:33.233519 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.233499 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:33.233519 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.233532 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:33.233697 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.233543 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:33.233697 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.233571 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.242531 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.242506 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-48.ec2.internal"] Apr 24 21:26:33.242648 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.242572 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller 
attach/detach" Apr 24 21:26:33.243405 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.243380 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:33.243491 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.243410 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.243491 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.243417 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:33.243491 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.243431 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:33.243491 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.243429 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-48.ec2.internal\": node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:33.244629 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.244617 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:33.244774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.244762 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.244812 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.244789 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:33.245377 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.245361 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:33.245469 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.245375 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:33.245469 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.245390 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:33.245469 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.245404 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:33.245469 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.245412 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:33.245469 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.245427 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:33.246766 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.246749 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.246846 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.246787 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:33.247461 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.247445 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:33.247533 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.247473 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:33.247533 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.247486 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:33.263990 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.263969 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:33.273275 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.273255 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f546f21f04c436cdaba8d6de0871ab03-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal\" (UID: \"f546f21f04c436cdaba8d6de0871ab03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.273366 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.273283 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f546f21f04c436cdaba8d6de0871ab03-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal\" (UID: \"f546f21f04c436cdaba8d6de0871ab03\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.273366 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.273302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be2bfd82c265058eef2da37b1062af3f-config\") pod \"kube-apiserver-proxy-ip-10-0-133-48.ec2.internal\" (UID: \"be2bfd82c265058eef2da37b1062af3f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.278050 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.278034 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-48.ec2.internal\" not found" node="ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.282460 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.282441 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-48.ec2.internal\" not found" node="ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.364714 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.364682 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:33.374162 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.374129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f546f21f04c436cdaba8d6de0871ab03-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal\" (UID: \"f546f21f04c436cdaba8d6de0871ab03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.374162 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.374163 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f546f21f04c436cdaba8d6de0871ab03-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal\" (UID: \"f546f21f04c436cdaba8d6de0871ab03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.374354 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.374181 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be2bfd82c265058eef2da37b1062af3f-config\") pod \"kube-apiserver-proxy-ip-10-0-133-48.ec2.internal\" (UID: \"be2bfd82c265058eef2da37b1062af3f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.374354 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.374224 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be2bfd82c265058eef2da37b1062af3f-config\") pod \"kube-apiserver-proxy-ip-10-0-133-48.ec2.internal\" (UID: \"be2bfd82c265058eef2da37b1062af3f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.374354 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.374234 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f546f21f04c436cdaba8d6de0871ab03-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal\" (UID: \"f546f21f04c436cdaba8d6de0871ab03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.374354 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.374243 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f546f21f04c436cdaba8d6de0871ab03-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal\" (UID: \"f546f21f04c436cdaba8d6de0871ab03\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.465622 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.465588 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:33.566477 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.566399 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:33.580947 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.580915 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.584780 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.584761 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-48.ec2.internal" Apr 24 21:26:33.667055 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.667013 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:33.767525 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.767500 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:33.868154 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.868068 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:33.968610 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:33.968568 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:33.973942 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.973921 2574 transport.go:147] "Certificate rotation detected, shutting down client connections 
to start using new credentials" Apr 24 21:26:33.974082 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:33.974067 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:26:34.068835 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:34.068811 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:34.070709 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.070690 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:26:34.081367 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.081344 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:26:34.083990 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.083963 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:21:33 +0000 UTC" deadline="2028-01-25 12:08:09.866162168 +0000 UTC" Apr 24 21:26:34.084051 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.083990 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15374h41m35.782174876s" Apr 24 21:26:34.106739 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.106711 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-s8vsk" Apr 24 21:26:34.114535 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.114509 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-s8vsk" 
Apr 24 21:26:34.165645 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:34.165403 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe2bfd82c265058eef2da37b1062af3f.slice/crio-a2e561d5b6da1a01b878cc3212a290ea8b5a920adede7aeb139b738a687f8d7c WatchSource:0}: Error finding container a2e561d5b6da1a01b878cc3212a290ea8b5a920adede7aeb139b738a687f8d7c: Status 404 returned error can't find the container with id a2e561d5b6da1a01b878cc3212a290ea8b5a920adede7aeb139b738a687f8d7c Apr 24 21:26:34.165943 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:34.165921 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf546f21f04c436cdaba8d6de0871ab03.slice/crio-d4b2e7b4b17ce4f32879f6343a58a6bbcc87538627cf79605ee617bcad73286f WatchSource:0}: Error finding container d4b2e7b4b17ce4f32879f6343a58a6bbcc87538627cf79605ee617bcad73286f: Status 404 returned error can't find the container with id d4b2e7b4b17ce4f32879f6343a58a6bbcc87538627cf79605ee617bcad73286f Apr 24 21:26:34.168914 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:34.168893 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:34.170778 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.170763 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:26:34.253153 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.253125 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:34.269038 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:34.269014 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:34.271203 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.271185 2574 
reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:34.369715 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:34.369686 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:34.470270 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:34.470177 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:34.570951 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:34.570917 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-48.ec2.internal\" not found" Apr 24 21:26:34.648887 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.648856 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:34.670881 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.670850 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" Apr 24 21:26:34.684817 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.684789 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:26:34.685726 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.685699 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-48.ec2.internal" Apr 24 21:26:34.693710 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:34.693687 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:26:35.046602 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.046564 2574 apiserver.go:52] 
"Watching apiserver" Apr 24 21:26:35.055914 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.055884 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:26:35.056317 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.056290 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-f9nvj","openshift-network-diagnostics/network-check-target-wvpqz","openshift-network-operator/iptables-alerter-4fzdt","openshift-ovn-kubernetes/ovnkube-node-xf75n","kube-system/konnectivity-agent-rdccs","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6","openshift-dns/node-resolver-g2zr8","openshift-image-registry/node-ca-pmknh","openshift-multus/network-metrics-daemon-jtqkc","kube-system/kube-apiserver-proxy-ip-10-0-133-48.ec2.internal","openshift-cluster-node-tuning-operator/tuned-nwk6j","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal","openshift-multus/multus-additional-cni-plugins-dxnlv"] Apr 24 21:26:35.059559 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.059504 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g2zr8" Apr 24 21:26:35.060785 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.060762 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:35.060902 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.060838 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:26:35.061485 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.061468 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4fzdt" Apr 24 21:26:35.061944 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.061797 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:26:35.061944 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.061827 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:26:35.061944 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.061878 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-ntglr\"" Apr 24 21:26:35.062174 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.062158 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.063832 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.063700 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-rdccs" Apr 24 21:26:35.063832 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.063732 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:26:35.063996 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.063858 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:26:35.063996 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.063964 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:26:35.064864 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.064840 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:26:35.064965 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.064918 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:26:35.065271 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.065255 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.065373 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.065320 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-gv74c\"" Apr 24 21:26:35.065696 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.065676 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:26:35.065894 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.065876 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-752gp\"" Apr 24 21:26:35.065978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.065961 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:26:35.066095 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.066057 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:26:35.066095 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.066058 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:26:35.066200 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.066095 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:26:35.066247 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.066234 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9cvct\"" Apr 24 21:26:35.066585 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.066564 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:26:35.066981 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.066961 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.067804 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.067583 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:26:35.067804 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.067631 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:26:35.067919 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.067859 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:26:35.067919 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.067875 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2jn2n\"" Apr 24 21:26:35.068763 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.068737 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pmknh" Apr 24 21:26:35.069496 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.069275 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:26:35.069496 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.069296 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:26:35.069496 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.069359 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:26:35.069496 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.069400 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7gtpj\"" Apr 24 21:26:35.069496 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.069278 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:26:35.070624 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.070601 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:35.070734 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.070706 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:26:35.071147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.071123 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:26:35.071225 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.071166 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:26:35.071674 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.071651 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:26:35.071764 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.071702 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mmf58\"" Apr 24 21:26:35.072415 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.072304 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.073871 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.073847 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dxnlv" Apr 24 21:26:35.077197 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.077090 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:26:35.077197 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.077131 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:26:35.077197 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.077184 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jztmk\"" Apr 24 21:26:35.077564 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.077259 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2qtt6\"" Apr 24 21:26:35.077748 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.077712 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:26:35.077839 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.077777 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:26:35.084907 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.084884 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-cni-bin\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.085008 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.084975 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7-host\") pod \"node-ca-pmknh\" (UID: \"5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7\") " pod="openshift-image-registry/node-ca-pmknh" Apr 24 21:26:35.085071 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085004 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-sys\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.085071 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085042 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.085172 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-registration-dir\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.085172 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085113 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkfg5\" (UniqueName: \"kubernetes.io/projected/998844f7-6ade-465d-8041-762411d1f8e2-kube-api-access-dkfg5\") pod \"iptables-alerter-4fzdt\" (UID: \"998844f7-6ade-465d-8041-762411d1f8e2\") " 
pod="openshift-network-operator/iptables-alerter-4fzdt" Apr 24 21:26:35.085172 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-var-lib-kubelet\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.085172 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085156 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjgjk\" (UniqueName: \"kubernetes.io/projected/c6abad34-23d6-4992-ab13-a2cf5ff8141a-kube-api-access-hjgjk\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.085388 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085181 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-tuned\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.085388 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085207 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eab474e7-7b20-4ca7-aedb-49915bb5ec3e-tmp-dir\") pod \"node-resolver-g2zr8\" (UID: \"eab474e7-7b20-4ca7-aedb-49915bb5ec3e\") " pod="openshift-dns/node-resolver-g2zr8" Apr 24 21:26:35.085388 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085231 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqhl5\" (UniqueName: 
\"kubernetes.io/projected/e6951713-4d29-44cb-82cd-3fe7e6cc1769-kube-api-access-gqhl5\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.085388 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-run-openvswitch\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.085388 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-etc-kubernetes\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.085388 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-sysctl-conf\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.085388 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-kubelet\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.085388 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085361 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-multus-socket-dir-parent\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085408 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-run-netns\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085455 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-run-systemd\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085490 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a62288c-a1f0-46d2-b77f-e15d23159b1a-ovn-node-metrics-cert\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085520 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-var-lib-cni-bin\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " 
pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085538 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrk9\" (UniqueName: \"kubernetes.io/projected/5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7-kube-api-access-rvrk9\") pod \"node-ca-pmknh\" (UID: \"5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7\") " pod="openshift-image-registry/node-ca-pmknh" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085567 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-systemd\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085591 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-slash\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085616 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-cni-netd\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a62288c-a1f0-46d2-b77f-e15d23159b1a-ovnkube-config\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6abad34-23d6-4992-ab13-a2cf5ff8141a-cni-binary-copy\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085708 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-run-multus-certs\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085752 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-kubernetes\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.085774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085773 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eab474e7-7b20-4ca7-aedb-49915bb5ec3e-hosts-file\") pod \"node-resolver-g2zr8\" (UID: \"eab474e7-7b20-4ca7-aedb-49915bb5ec3e\") " pod="openshift-dns/node-resolver-g2zr8" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085788 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-sys-fs\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085821 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/998844f7-6ade-465d-8041-762411d1f8e2-iptables-alerter-script\") pod \"iptables-alerter-4fzdt\" (UID: \"998844f7-6ade-465d-8041-762411d1f8e2\") " pod="openshift-network-operator/iptables-alerter-4fzdt" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085846 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-run-netns\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-modprobe-d\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 
24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085885 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-sysctl-d\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-run\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085932 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9fdh\" (UniqueName: \"kubernetes.io/projected/eab474e7-7b20-4ca7-aedb-49915bb5ec3e-kube-api-access-t9fdh\") pod \"node-resolver-g2zr8\" (UID: \"eab474e7-7b20-4ca7-aedb-49915bb5ec3e\") " pod="openshift-dns/node-resolver-g2zr8" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.085952 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrs5w\" (UniqueName: \"kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w\") pod \"network-check-target-wvpqz\" (UID: \"d326ac93-1c28-465f-80fb-a44c5fd5cb0b\") " pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086000 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/998844f7-6ade-465d-8041-762411d1f8e2-host-slash\") pod 
\"iptables-alerter-4fzdt\" (UID: \"998844f7-6ade-465d-8041-762411d1f8e2\") " pod="openshift-network-operator/iptables-alerter-4fzdt" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-etc-openvswitch\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086064 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086089 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-lib-modules\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086118 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-device-dir\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086140 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-etc-selinux\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086162 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-node-log\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.086498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086184 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-system-cni-dir\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086204 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-var-lib-openvswitch\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086219 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-multus-cni-dir\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.087241 
ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086233 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6abad34-23d6-4992-ab13-a2cf5ff8141a-multus-daemon-config\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086254 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7-serviceca\") pod \"node-ca-pmknh\" (UID: \"5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7\") " pod="openshift-image-registry/node-ca-pmknh" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086273 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-systemd-units\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-run-ovn\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086313 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-multus-conf-dir\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " 
pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086348 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-sysconfig\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086367 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbpzf\" (UniqueName: \"kubernetes.io/projected/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-kube-api-access-mbpzf\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086391 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prg69\" (UniqueName: \"kubernetes.io/projected/e2af22c1-baca-4054-87ff-daf77606438a-kube-api-access-prg69\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbqdx\" (UniqueName: \"kubernetes.io/projected/1a62288c-a1f0-46d2-b77f-e15d23159b1a-kube-api-access-gbqdx\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086436 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-cnibin\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-run-k8s-cni-cncf-io\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086485 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-hostroot\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086506 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-var-lib-kubelet\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086543 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-tmp\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.087241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:35.088094 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086606 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a62288c-a1f0-46d2-b77f-e15d23159b1a-ovnkube-script-lib\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.088094 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086635 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0591fbe5-106c-49b3-9a9e-61b9d9370f91-agent-certs\") pod \"konnectivity-agent-rdccs\" (UID: \"0591fbe5-106c-49b3-9a9e-61b9d9370f91\") " pod="kube-system/konnectivity-agent-rdccs" Apr 24 21:26:35.088094 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086658 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0591fbe5-106c-49b3-9a9e-61b9d9370f91-konnectivity-ca\") pod \"konnectivity-agent-rdccs\" (UID: \"0591fbe5-106c-49b3-9a9e-61b9d9370f91\") " pod="kube-system/konnectivity-agent-rdccs" Apr 24 21:26:35.088094 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086681 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-var-lib-cni-multus\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.088094 ip-10-0-133-48 kubenswrapper[2574]: I0424 
21:26:35.086704 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-socket-dir\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.088094 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-log-socket\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.088094 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086754 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a62288c-a1f0-46d2-b77f-e15d23159b1a-env-overrides\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.088094 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086775 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-os-release\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.088094 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.086796 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-host\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " 
pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.115797 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.115762 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:34 +0000 UTC" deadline="2027-11-15 06:27:19.026961327 +0000 UTC" Apr 24 21:26:35.115797 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.115795 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13665h0m43.911169474s" Apr 24 21:26:35.148075 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.148015 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-48.ec2.internal" event={"ID":"be2bfd82c265058eef2da37b1062af3f","Type":"ContainerStarted","Data":"a2e561d5b6da1a01b878cc3212a290ea8b5a920adede7aeb139b738a687f8d7c"} Apr 24 21:26:35.149655 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.149630 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" event={"ID":"f546f21f04c436cdaba8d6de0871ab03","Type":"ContainerStarted","Data":"d4b2e7b4b17ce4f32879f6343a58a6bbcc87538627cf79605ee617bcad73286f"} Apr 24 21:26:35.172095 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.172064 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:26:35.190060 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.188026 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-run-netns\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.190060 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.188096 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-run-systemd\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.190060 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.188200 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-run-netns\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.190060 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.188362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-run-systemd\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.190376 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a62288c-a1f0-46d2-b77f-e15d23159b1a-ovn-node-metrics-cert\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.190429 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190412 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-var-lib-cni-bin\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.190473 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190439 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rvrk9\" (UniqueName: \"kubernetes.io/projected/5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7-kube-api-access-rvrk9\") pod \"node-ca-pmknh\" (UID: \"5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7\") " pod="openshift-image-registry/node-ca-pmknh" Apr 24 21:26:35.190522 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-systemd\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.190522 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-slash\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.190522 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190501 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:26:35.190640 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-cni-netd\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.190640 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190570 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f9c892b6-3ea7-435d-a215-90c3211e772b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv" Apr 24 21:26:35.190640 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.190640 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190629 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a62288c-a1f0-46d2-b77f-e15d23159b1a-ovnkube-config\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.190806 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6abad34-23d6-4992-ab13-a2cf5ff8141a-cni-binary-copy\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.190806 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-run-multus-certs\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.190806 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190716 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-kubernetes\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.190806 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eab474e7-7b20-4ca7-aedb-49915bb5ec3e-hosts-file\") pod \"node-resolver-g2zr8\" (UID: \"eab474e7-7b20-4ca7-aedb-49915bb5ec3e\") " pod="openshift-dns/node-resolver-g2zr8" Apr 24 21:26:35.190806 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190769 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-sys-fs\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.190806 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190797 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/998844f7-6ade-465d-8041-762411d1f8e2-iptables-alerter-script\") pod \"iptables-alerter-4fzdt\" (UID: \"998844f7-6ade-465d-8041-762411d1f8e2\") " pod="openshift-network-operator/iptables-alerter-4fzdt" Apr 24 21:26:35.191055 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190829 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-run-netns\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.191055 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190885 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-var-lib-cni-bin\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.191055 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-run-netns\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.191055 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.190930 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.191266 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191046 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-slash\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.191266 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191063 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-cni-netd\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.191266 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-modprobe-d\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.191266 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191110 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eab474e7-7b20-4ca7-aedb-49915bb5ec3e-hosts-file\") pod \"node-resolver-g2zr8\" (UID: \"eab474e7-7b20-4ca7-aedb-49915bb5ec3e\") " pod="openshift-dns/node-resolver-g2zr8" Apr 24 21:26:35.191266 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-sysctl-d\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.191266 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191163 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-run\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.191266 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191197 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9fdh\" (UniqueName: \"kubernetes.io/projected/eab474e7-7b20-4ca7-aedb-49915bb5ec3e-kube-api-access-t9fdh\") pod \"node-resolver-g2zr8\" (UID: \"eab474e7-7b20-4ca7-aedb-49915bb5ec3e\") " pod="openshift-dns/node-resolver-g2zr8" Apr 24 21:26:35.191266 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191229 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-modprobe-d\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-run-multus-certs\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191325 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-kubernetes\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrs5w\" (UniqueName: 
\"kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w\") pod \"network-check-target-wvpqz\" (UID: \"d326ac93-1c28-465f-80fb-a44c5fd5cb0b\") " pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191401 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-run\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191409 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcn28\" (UniqueName: \"kubernetes.io/projected/f9c892b6-3ea7-435d-a215-90c3211e772b-kube-api-access-tcn28\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191452 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/998844f7-6ade-465d-8041-762411d1f8e2-host-slash\") pod \"iptables-alerter-4fzdt\" (UID: \"998844f7-6ade-465d-8041-762411d1f8e2\") " pod="openshift-network-operator/iptables-alerter-4fzdt" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-etc-openvswitch\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191494 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-sysctl-d\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191566 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-sys-fs\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-lib-modules\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191613 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-device-dir\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.191700 ip-10-0-133-48 
kubenswrapper[2574]: I0424 21:26:35.191669 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-etc-selinux\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191691 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/998844f7-6ade-465d-8041-762411d1f8e2-host-slash\") pod \"iptables-alerter-4fzdt\" (UID: \"998844f7-6ade-465d-8041-762411d1f8e2\") " pod="openshift-network-operator/iptables-alerter-4fzdt" Apr 24 21:26:35.191700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191703 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-node-log\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-system-cni-dir\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191750 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-etc-openvswitch\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.192389 ip-10-0-133-48 
kubenswrapper[2574]: I0424 21:26:35.191767 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-var-lib-openvswitch\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191634 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a62288c-a1f0-46d2-b77f-e15d23159b1a-ovnkube-config\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191799 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-multus-cni-dir\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191883 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-node-log\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 
21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6abad34-23d6-4992-ab13-a2cf5ff8141a-multus-daemon-config\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191951 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7-serviceca\") pod \"node-ca-pmknh\" (UID: \"5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7\") " pod="openshift-image-registry/node-ca-pmknh" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191965 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-multus-cni-dir\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.191987 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9c892b6-3ea7-435d-a215-90c3211e772b-cnibin\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-device-dir\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 
21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192131 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-var-lib-openvswitch\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192184 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-system-cni-dir\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192240 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/998844f7-6ade-465d-8041-762411d1f8e2-iptables-alerter-script\") pod \"iptables-alerter-4fzdt\" (UID: \"998844f7-6ade-465d-8041-762411d1f8e2\") " pod="openshift-network-operator/iptables-alerter-4fzdt" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-lib-modules\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.192389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192305 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9c892b6-3ea7-435d-a215-90c3211e772b-os-release\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " 
pod="openshift-multus/multus-additional-cni-plugins-dxnlv" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192405 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-etc-selinux\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192435 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7-serviceca\") pod \"node-ca-pmknh\" (UID: \"5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7\") " pod="openshift-image-registry/node-ca-pmknh" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192460 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6abad34-23d6-4992-ab13-a2cf5ff8141a-cni-binary-copy\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192468 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-systemd\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192489 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-systemd-units\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-run-ovn\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192580 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-run-ovn\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192583 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-multus-conf-dir\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192630 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-systemd-units\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192682 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-multus-conf-dir\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.193147 
ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192718 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-sysconfig\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192753 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbpzf\" (UniqueName: \"kubernetes.io/projected/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-kube-api-access-mbpzf\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9c892b6-3ea7-435d-a215-90c3211e772b-cni-binary-copy\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-sysconfig\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prg69\" (UniqueName: \"kubernetes.io/projected/e2af22c1-baca-4054-87ff-daf77606438a-kube-api-access-prg69\") pod \"network-metrics-daemon-jtqkc\" (UID: 
\"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbqdx\" (UniqueName: \"kubernetes.io/projected/1a62288c-a1f0-46d2-b77f-e15d23159b1a-kube-api-access-gbqdx\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6abad34-23d6-4992-ab13-a2cf5ff8141a-multus-daemon-config\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.193147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192913 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-cnibin\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.192978 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-cnibin\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193013 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-run-k8s-cni-cncf-io\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " 
pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193041 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-hostroot\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-var-lib-kubelet\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193111 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-run-k8s-cni-cncf-io\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193140 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-hostroot\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193187 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-tmp\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.193978 ip-10-0-133-48 
kubenswrapper[2574]: I0424 21:26:35.193262 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-var-lib-kubelet\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193260 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9c892b6-3ea7-435d-a215-90c3211e772b-system-cni-dir\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193307 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a62288c-a1f0-46d2-b77f-e15d23159b1a-ovnkube-script-lib\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.193424 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193427 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0591fbe5-106c-49b3-9a9e-61b9d9370f91-agent-certs\") pod \"konnectivity-agent-rdccs\" (UID: \"0591fbe5-106c-49b3-9a9e-61b9d9370f91\") " pod="kube-system/konnectivity-agent-rdccs" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.193528 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs podName:e2af22c1-baca-4054-87ff-daf77606438a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:35.693477992 +0000 UTC m=+3.040384339 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs") pod "network-metrics-daemon-jtqkc" (UID: "e2af22c1-baca-4054-87ff-daf77606438a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0591fbe5-106c-49b3-9a9e-61b9d9370f91-konnectivity-ca\") pod \"konnectivity-agent-rdccs\" (UID: \"0591fbe5-106c-49b3-9a9e-61b9d9370f91\") " pod="kube-system/konnectivity-agent-rdccs" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193606 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-var-lib-cni-multus\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.193978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193637 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f9c892b6-3ea7-435d-a215-90c3211e772b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-socket-dir\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193703 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-log-socket\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193732 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a62288c-a1f0-46d2-b77f-e15d23159b1a-env-overrides\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193763 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-os-release\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-host\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193815 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-cni-bin\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7-host\") pod \"node-ca-pmknh\" (UID: \"5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7\") " pod="openshift-image-registry/node-ca-pmknh" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193892 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-sys\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193969 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1a62288c-a1f0-46d2-b77f-e15d23159b1a-ovnkube-script-lib\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-var-lib-cni-multus\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194043 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-registration-dir\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.193970 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-registration-dir\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194116 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkfg5\" (UniqueName: \"kubernetes.io/projected/998844f7-6ade-465d-8041-762411d1f8e2-kube-api-access-dkfg5\") pod \"iptables-alerter-4fzdt\" (UID: \"998844f7-6ade-465d-8041-762411d1f8e2\") " pod="openshift-network-operator/iptables-alerter-4fzdt" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194124 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0591fbe5-106c-49b3-9a9e-61b9d9370f91-konnectivity-ca\") pod \"konnectivity-agent-rdccs\" (UID: \"0591fbe5-106c-49b3-9a9e-61b9d9370f91\") " pod="kube-system/konnectivity-agent-rdccs" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194144 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-var-lib-kubelet\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.194867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194152 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-socket-dir\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194193 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-host-var-lib-kubelet\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194194 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjgjk\" (UniqueName: \"kubernetes.io/projected/c6abad34-23d6-4992-ab13-a2cf5ff8141a-kube-api-access-hjgjk\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj" Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194221 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-tuned\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194252 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f9c892b6-3ea7-435d-a215-90c3211e772b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194282 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eab474e7-7b20-4ca7-aedb-49915bb5ec3e-tmp-dir\") pod \"node-resolver-g2zr8\" (UID: \"eab474e7-7b20-4ca7-aedb-49915bb5ec3e\") " pod="openshift-dns/node-resolver-g2zr8"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqhl5\" (UniqueName: \"kubernetes.io/projected/e6951713-4d29-44cb-82cd-3fe7e6cc1769-kube-api-access-gqhl5\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-run-openvswitch\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194378 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-etc-kubernetes\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194408 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-sysctl-conf\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-kubelet\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.194903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-multus-socket-dir-parent\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.195055 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-cni-bin\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.195179 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-host-kubelet\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.195214 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-multus-socket-dir-parent\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.195248 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-os-release\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.195273 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-host\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.195299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-sys\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j"
Apr 24 21:26:35.195738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.195347 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7-host\") pod \"node-ca-pmknh\" (UID: \"5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7\") " pod="openshift-image-registry/node-ca-pmknh"
Apr 24 21:26:35.196559 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.195363 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6951713-4d29-44cb-82cd-3fe7e6cc1769-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6"
Apr 24 21:26:35.196559 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.195436 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-run-openvswitch\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n"
Apr 24 21:26:35.196559 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.195641 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eab474e7-7b20-4ca7-aedb-49915bb5ec3e-tmp-dir\") pod \"node-resolver-g2zr8\" (UID: \"eab474e7-7b20-4ca7-aedb-49915bb5ec3e\") " pod="openshift-dns/node-resolver-g2zr8"
Apr 24 21:26:35.196559 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.195718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a62288c-a1f0-46d2-b77f-e15d23159b1a-ovn-node-metrics-cert\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n"
Apr 24 21:26:35.196559 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.195736 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6abad34-23d6-4992-ab13-a2cf5ff8141a-etc-kubernetes\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj"
Apr 24 21:26:35.196559 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.196105 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a62288c-a1f0-46d2-b77f-e15d23159b1a-env-overrides\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n"
Apr 24 21:26:35.196559 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.196154 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-sysctl-conf\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j"
Apr 24 21:26:35.196559 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.196192 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0591fbe5-106c-49b3-9a9e-61b9d9370f91-agent-certs\") pod \"konnectivity-agent-rdccs\" (UID: \"0591fbe5-106c-49b3-9a9e-61b9d9370f91\") " pod="kube-system/konnectivity-agent-rdccs"
Apr 24 21:26:35.196559 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.196280 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a62288c-a1f0-46d2-b77f-e15d23159b1a-log-socket\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n"
Apr 24 21:26:35.196559 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.196387 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-tmp\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j"
Apr 24 21:26:35.197055 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.197036 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:35.197146 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.197061 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:35.197146 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.197073 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nrs5w for pod openshift-network-diagnostics/network-check-target-wvpqz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:35.197146 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.197128 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w podName:d326ac93-1c28-465f-80fb-a44c5fd5cb0b nodeName:}" failed. No retries permitted until 2026-04-24 21:26:35.697115014 +0000 UTC m=+3.044021352 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nrs5w" (UniqueName: "kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w") pod "network-check-target-wvpqz" (UID: "d326ac93-1c28-465f-80fb-a44c5fd5cb0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:35.198212 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.198191 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-etc-tuned\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j"
Apr 24 21:26:35.199472 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.199451 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9fdh\" (UniqueName: \"kubernetes.io/projected/eab474e7-7b20-4ca7-aedb-49915bb5ec3e-kube-api-access-t9fdh\") pod \"node-resolver-g2zr8\" (UID: \"eab474e7-7b20-4ca7-aedb-49915bb5ec3e\") " pod="openshift-dns/node-resolver-g2zr8"
Apr 24 21:26:35.199672 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.199651 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrk9\" (UniqueName: \"kubernetes.io/projected/5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7-kube-api-access-rvrk9\") pod \"node-ca-pmknh\" (UID: \"5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7\") " pod="openshift-image-registry/node-ca-pmknh"
Apr 24 21:26:35.200659 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.200618 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbqdx\" (UniqueName: \"kubernetes.io/projected/1a62288c-a1f0-46d2-b77f-e15d23159b1a-kube-api-access-gbqdx\") pod \"ovnkube-node-xf75n\" (UID: \"1a62288c-a1f0-46d2-b77f-e15d23159b1a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf75n"
Apr 24 21:26:35.201209 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.201188 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prg69\" (UniqueName: \"kubernetes.io/projected/e2af22c1-baca-4054-87ff-daf77606438a-kube-api-access-prg69\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc"
Apr 24 21:26:35.201301 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.201250 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbpzf\" (UniqueName: \"kubernetes.io/projected/33bbf579-be90-4be9-aa5b-30ac9af3f6d2-kube-api-access-mbpzf\") pod \"tuned-nwk6j\" (UID: \"33bbf579-be90-4be9-aa5b-30ac9af3f6d2\") " pod="openshift-cluster-node-tuning-operator/tuned-nwk6j"
Apr 24 21:26:35.205551 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.205527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjgjk\" (UniqueName: \"kubernetes.io/projected/c6abad34-23d6-4992-ab13-a2cf5ff8141a-kube-api-access-hjgjk\") pod \"multus-f9nvj\" (UID: \"c6abad34-23d6-4992-ab13-a2cf5ff8141a\") " pod="openshift-multus/multus-f9nvj"
Apr 24 21:26:35.207996 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.207972 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqhl5\" (UniqueName: \"kubernetes.io/projected/e6951713-4d29-44cb-82cd-3fe7e6cc1769-kube-api-access-gqhl5\") pod \"aws-ebs-csi-driver-node-pm4q6\" (UID: \"e6951713-4d29-44cb-82cd-3fe7e6cc1769\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6"
Apr 24 21:26:35.208099 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.208082 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkfg5\" (UniqueName: \"kubernetes.io/projected/998844f7-6ade-465d-8041-762411d1f8e2-kube-api-access-dkfg5\") pod \"iptables-alerter-4fzdt\" (UID: \"998844f7-6ade-465d-8041-762411d1f8e2\") " pod="openshift-network-operator/iptables-alerter-4fzdt"
Apr 24 21:26:35.295275 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.295235 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f9c892b6-3ea7-435d-a215-90c3211e772b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.295481 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.295409 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcn28\" (UniqueName: \"kubernetes.io/projected/f9c892b6-3ea7-435d-a215-90c3211e772b-kube-api-access-tcn28\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.295481 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.295458 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9c892b6-3ea7-435d-a215-90c3211e772b-cnibin\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.295481 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.295481 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9c892b6-3ea7-435d-a215-90c3211e772b-os-release\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.295645 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.295503 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9c892b6-3ea7-435d-a215-90c3211e772b-cni-binary-copy\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.295645 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.295524 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9c892b6-3ea7-435d-a215-90c3211e772b-system-cni-dir\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.295645 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.295564 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9c892b6-3ea7-435d-a215-90c3211e772b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.295645 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.295596 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f9c892b6-3ea7-435d-a215-90c3211e772b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.295878 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.295854 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f9c892b6-3ea7-435d-a215-90c3211e772b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.295949 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.295934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9c892b6-3ea7-435d-a215-90c3211e772b-cnibin\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.296016 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.296001 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9c892b6-3ea7-435d-a215-90c3211e772b-os-release\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.296071 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.296048 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9c892b6-3ea7-435d-a215-90c3211e772b-system-cni-dir\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.296126 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.296082 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f9c892b6-3ea7-435d-a215-90c3211e772b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.296185 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.296167 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9c892b6-3ea7-435d-a215-90c3211e772b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.296247 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.296227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9c892b6-3ea7-435d-a215-90c3211e772b-cni-binary-copy\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.304346 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.304244 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcn28\" (UniqueName: \"kubernetes.io/projected/f9c892b6-3ea7-435d-a215-90c3211e772b-kube-api-access-tcn28\") pod \"multus-additional-cni-plugins-dxnlv\" (UID: \"f9c892b6-3ea7-435d-a215-90c3211e772b\") " pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.371142 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.371098 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g2zr8"
Apr 24 21:26:35.380171 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.380134 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rdccs"
Apr 24 21:26:35.390818 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.390791 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4fzdt"
Apr 24 21:26:35.395518 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.395495 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n"
Apr 24 21:26:35.401248 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.401225 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6"
Apr 24 21:26:35.401396 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.401226 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:35.407425 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.407400 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f9nvj"
Apr 24 21:26:35.413072 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.413052 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pmknh"
Apr 24 21:26:35.419696 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.419666 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nwk6j"
Apr 24 21:26:35.423372 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.423352 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dxnlv"
Apr 24 21:26:35.697702 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.697618 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrs5w\" (UniqueName: \"kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w\") pod \"network-check-target-wvpqz\" (UID: \"d326ac93-1c28-465f-80fb-a44c5fd5cb0b\") " pod="openshift-network-diagnostics/network-check-target-wvpqz"
Apr 24 21:26:35.697702 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:35.697671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc"
Apr 24 21:26:35.697897 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.697776 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:35.697897 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.697783 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:35.697897 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.697805 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:35.697897 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.697817 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nrs5w for pod openshift-network-diagnostics/network-check-target-wvpqz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:35.697897 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.697835 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs podName:e2af22c1-baca-4054-87ff-daf77606438a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:36.697820934 +0000 UTC m=+4.044727263 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs") pod "network-metrics-daemon-jtqkc" (UID: "e2af22c1-baca-4054-87ff-daf77606438a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:35.697897 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:35.697861 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w podName:d326ac93-1c28-465f-80fb-a44c5fd5cb0b nodeName:}" failed. No retries permitted until 2026-04-24 21:26:36.697848791 +0000 UTC m=+4.044755115 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-nrs5w" (UniqueName: "kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w") pod "network-check-target-wvpqz" (UID: "d326ac93-1c28-465f-80fb-a44c5fd5cb0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:35.745223 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:35.745174 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6abad34_23d6_4992_ab13_a2cf5ff8141a.slice/crio-7d522c171548d7a29a579fc9cf691ad0063b0f0889ce7fba70c29765c300dfe0 WatchSource:0}: Error finding container 7d522c171548d7a29a579fc9cf691ad0063b0f0889ce7fba70c29765c300dfe0: Status 404 returned error can't find the container with id 7d522c171548d7a29a579fc9cf691ad0063b0f0889ce7fba70c29765c300dfe0
Apr 24 21:26:35.746847 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:35.746715 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33bbf579_be90_4be9_aa5b_30ac9af3f6d2.slice/crio-6e8749023857a3f750ea100407d9d78e83aae9ffd0e5f007773dc9ef906bc80f WatchSource:0}: Error finding container 6e8749023857a3f750ea100407d9d78e83aae9ffd0e5f007773dc9ef906bc80f: Status 404 returned error can't find the container with id 6e8749023857a3f750ea100407d9d78e83aae9ffd0e5f007773dc9ef906bc80f
Apr 24 21:26:35.749805 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:35.749781 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod998844f7_6ade_465d_8041_762411d1f8e2.slice/crio-e57978c6190ba5646662f4fcf7006dba7669425dfe62a17a6c413e6c3d508313 WatchSource:0}: Error finding container e57978c6190ba5646662f4fcf7006dba7669425dfe62a17a6c413e6c3d508313: Status 404 returned error can't find the container with id e57978c6190ba5646662f4fcf7006dba7669425dfe62a17a6c413e6c3d508313
Apr 24 21:26:35.753170 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:35.753133 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a62288c_a1f0_46d2_b77f_e15d23159b1a.slice/crio-c33c63fd7a9d4a0cc4e608e83672c3103ce5744f0ca842350ac29bed8f3a5cec WatchSource:0}: Error finding container c33c63fd7a9d4a0cc4e608e83672c3103ce5744f0ca842350ac29bed8f3a5cec: Status 404 returned error can't find the container with id c33c63fd7a9d4a0cc4e608e83672c3103ce5744f0ca842350ac29bed8f3a5cec
Apr 24 21:26:35.754034 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:35.754010 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c88e2e4_e223_40b9_b7e7_7cc6d01a43d7.slice/crio-a53e49e0902c78d83c713ef19dee8519485c232a030bd2c1d178196468d5ece2 WatchSource:0}: Error finding container a53e49e0902c78d83c713ef19dee8519485c232a030bd2c1d178196468d5ece2: Status 404 returned error can't find the container with id a53e49e0902c78d83c713ef19dee8519485c232a030bd2c1d178196468d5ece2
Apr 24 21:26:35.755534 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:35.755511 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9c892b6_3ea7_435d_a215_90c3211e772b.slice/crio-6d799f29e750a3c6a596e0c7d623b5ec2e223b39e2f9030d150048be4dba7ca6 WatchSource:0}: Error finding container 6d799f29e750a3c6a596e0c7d623b5ec2e223b39e2f9030d150048be4dba7ca6: Status 404 returned error can't find the container with id 6d799f29e750a3c6a596e0c7d623b5ec2e223b39e2f9030d150048be4dba7ca6
Apr 24 21:26:35.756478 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:35.756458 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0591fbe5_106c_49b3_9a9e_61b9d9370f91.slice/crio-a0cb39500bf799746162f8c86b2583a4e4013a7afb48fa916d88bb2480fd81f3 WatchSource:0}: Error finding container a0cb39500bf799746162f8c86b2583a4e4013a7afb48fa916d88bb2480fd81f3: Status 404 returned error can't find the container with id a0cb39500bf799746162f8c86b2583a4e4013a7afb48fa916d88bb2480fd81f3
Apr 24 21:26:35.757688 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:35.757659 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeab474e7_7b20_4ca7_aedb_49915bb5ec3e.slice/crio-c85041e585d78af619210c7c786f96166d52bf7eef0ddc2e5d1909ac43627e7c WatchSource:0}: Error finding container c85041e585d78af619210c7c786f96166d52bf7eef0ddc2e5d1909ac43627e7c: Status 404 returned error can't find the container with id c85041e585d78af619210c7c786f96166d52bf7eef0ddc2e5d1909ac43627e7c
Apr 24 21:26:35.758399 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:26:35.758291 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6951713_4d29_44cb_82cd_3fe7e6cc1769.slice/crio-6bb15629b67bd6dc6827ce0eda62337fdfbd61fc729c9cf07a0584b8cb6d439a WatchSource:0}: Error finding container 6bb15629b67bd6dc6827ce0eda62337fdfbd61fc729c9cf07a0584b8cb6d439a: Status 404 returned error can't find the container with id 6bb15629b67bd6dc6827ce0eda62337fdfbd61fc729c9cf07a0584b8cb6d439a
Apr 24 21:26:36.116596 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.116485 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:34 +0000 UTC" deadline="2027-10-01 16:14:05.367141441 +0000 UTC"
Apr 24 21:26:36.116596 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.116534 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12594h47m29.250614238s"
Apr 24 21:26:36.142884 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.142851 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc"
Apr 24 21:26:36.143068 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:36.143028 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a"
Apr 24 21:26:36.143580 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.143523 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz"
Apr 24 21:26:36.143701 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:36.143655 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b"
Apr 24 21:26:36.161954 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.161911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f9nvj" event={"ID":"c6abad34-23d6-4992-ab13-a2cf5ff8141a","Type":"ContainerStarted","Data":"7d522c171548d7a29a579fc9cf691ad0063b0f0889ce7fba70c29765c300dfe0"}
Apr 24 21:26:36.170580 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.170430 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g2zr8" event={"ID":"eab474e7-7b20-4ca7-aedb-49915bb5ec3e","Type":"ContainerStarted","Data":"c85041e585d78af619210c7c786f96166d52bf7eef0ddc2e5d1909ac43627e7c"}
Apr 24 21:26:36.173989 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.173925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dxnlv" event={"ID":"f9c892b6-3ea7-435d-a215-90c3211e772b","Type":"ContainerStarted","Data":"6d799f29e750a3c6a596e0c7d623b5ec2e223b39e2f9030d150048be4dba7ca6"}
Apr 24 21:26:36.177610 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.177533 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rdccs" event={"ID":"0591fbe5-106c-49b3-9a9e-61b9d9370f91","Type":"ContainerStarted","Data":"a0cb39500bf799746162f8c86b2583a4e4013a7afb48fa916d88bb2480fd81f3"}
Apr 24 21:26:36.178978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.178943 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" event={"ID":"1a62288c-a1f0-46d2-b77f-e15d23159b1a","Type":"ContainerStarted","Data":"c33c63fd7a9d4a0cc4e608e83672c3103ce5744f0ca842350ac29bed8f3a5cec"}
Apr 24 21:26:36.183482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.183446 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-48.ec2.internal" event={"ID":"be2bfd82c265058eef2da37b1062af3f","Type":"ContainerStarted","Data":"1b1dad92926f7d675f23daebc5ce878f9f94b8fce9a87d3194abc26cebacf624"}
Apr 24 21:26:36.190805 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.190777 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" event={"ID":"e6951713-4d29-44cb-82cd-3fe7e6cc1769","Type":"ContainerStarted","Data":"6bb15629b67bd6dc6827ce0eda62337fdfbd61fc729c9cf07a0584b8cb6d439a"}
Apr 24 21:26:36.197585 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.197555 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pmknh" event={"ID":"5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7","Type":"ContainerStarted","Data":"a53e49e0902c78d83c713ef19dee8519485c232a030bd2c1d178196468d5ece2"}
Apr 24 21:26:36.205810 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.205778 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" event={"ID":"33bbf579-be90-4be9-aa5b-30ac9af3f6d2","Type":"ContainerStarted","Data":"6e8749023857a3f750ea100407d9d78e83aae9ffd0e5f007773dc9ef906bc80f"}
Apr 24 21:26:36.215410 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.214253 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4fzdt" event={"ID":"998844f7-6ade-465d-8041-762411d1f8e2","Type":"ContainerStarted","Data":"e57978c6190ba5646662f4fcf7006dba7669425dfe62a17a6c413e6c3d508313"}
Apr 24 21:26:36.714807 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:36.713489 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrs5w\" (UniqueName: \"kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w\") pod \"network-check-target-wvpqz\" (UID: \"d326ac93-1c28-465f-80fb-a44c5fd5cb0b\") " pod="openshift-network-diagnostics/network-check-target-wvpqz"
Apr 24 21:26:36.714807 ip-10-0-133-48
kubenswrapper[2574]: I0424 21:26:36.713582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:36.714807 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:36.714070 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:36.714807 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:36.714154 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs podName:e2af22c1-baca-4054-87ff-daf77606438a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:38.71413261 +0000 UTC m=+6.061038953 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs") pod "network-metrics-daemon-jtqkc" (UID: "e2af22c1-baca-4054-87ff-daf77606438a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:36.714807 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:36.714683 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:36.714807 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:36.714702 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:36.714807 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:36.714715 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nrs5w for pod 
openshift-network-diagnostics/network-check-target-wvpqz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:36.714807 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:36.714774 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w podName:d326ac93-1c28-465f-80fb-a44c5fd5cb0b nodeName:}" failed. No retries permitted until 2026-04-24 21:26:38.714756677 +0000 UTC m=+6.061663002 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-nrs5w" (UniqueName: "kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w") pod "network-check-target-wvpqz" (UID: "d326ac93-1c28-465f-80fb-a44c5fd5cb0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:37.236637 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:37.236535 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" event={"ID":"f546f21f04c436cdaba8d6de0871ab03","Type":"ContainerStarted","Data":"0cfedd834d7a0d8ceeb8a6d02c6135991ae93e6b1473f3939468d52f46d5ecb7"} Apr 24 21:26:37.249976 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:37.249909 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-48.ec2.internal" podStartSLOduration=3.2498909129999998 podStartE2EDuration="3.249890913s" podCreationTimestamp="2026-04-24 21:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:36.198982272 +0000 UTC m=+3.545888620" watchObservedRunningTime="2026-04-24 21:26:37.249890913 +0000 UTC 
m=+4.596797261" Apr 24 21:26:38.044794 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.044758 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-w7g56"] Apr 24 21:26:38.047819 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.047791 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:38.047945 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.047916 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:26:38.126049 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.126001 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:38.126217 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.126112 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/de5ae8c4-d942-404b-a27a-2ba51dd2184a-dbus\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:38.126217 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.126161 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/de5ae8c4-d942-404b-a27a-2ba51dd2184a-kubelet-config\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:38.142556 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.142521 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:38.142733 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.142676 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:26:38.143108 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.143086 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:38.143205 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.143186 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:26:38.228247 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.227447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:38.228247 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.227527 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/de5ae8c4-d942-404b-a27a-2ba51dd2184a-dbus\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:38.228247 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.227569 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/de5ae8c4-d942-404b-a27a-2ba51dd2184a-kubelet-config\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:38.228247 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.227697 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/de5ae8c4-d942-404b-a27a-2ba51dd2184a-kubelet-config\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:38.228247 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.227825 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:38.228247 
ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.227887 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret podName:de5ae8c4-d942-404b-a27a-2ba51dd2184a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:38.727867162 +0000 UTC m=+6.074773507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret") pod "global-pull-secret-syncer-w7g56" (UID: "de5ae8c4-d942-404b-a27a-2ba51dd2184a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:38.228247 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.227952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/de5ae8c4-d942-404b-a27a-2ba51dd2184a-dbus\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:38.254653 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.253908 2574 generic.go:358] "Generic (PLEG): container finished" podID="f546f21f04c436cdaba8d6de0871ab03" containerID="0cfedd834d7a0d8ceeb8a6d02c6135991ae93e6b1473f3939468d52f46d5ecb7" exitCode=0 Apr 24 21:26:38.254653 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.253964 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" event={"ID":"f546f21f04c436cdaba8d6de0871ab03","Type":"ContainerDied","Data":"0cfedd834d7a0d8ceeb8a6d02c6135991ae93e6b1473f3939468d52f46d5ecb7"} Apr 24 21:26:38.254653 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.253992 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" 
event={"ID":"f546f21f04c436cdaba8d6de0871ab03","Type":"ContainerStarted","Data":"3fab8250717a858a7b72a1383f2fb5d627d0b9221ab7e4f57ce567e1ff6cb276"} Apr 24 21:26:38.731560 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.731518 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:38.731792 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.731611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrs5w\" (UniqueName: \"kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w\") pod \"network-check-target-wvpqz\" (UID: \"d326ac93-1c28-465f-80fb-a44c5fd5cb0b\") " pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:38.731792 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:38.731646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:38.731792 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.731783 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:38.732001 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.731848 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret podName:de5ae8c4-d942-404b-a27a-2ba51dd2184a nodeName:}" failed. 
No retries permitted until 2026-04-24 21:26:39.731828927 +0000 UTC m=+7.078735258 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret") pod "global-pull-secret-syncer-w7g56" (UID: "de5ae8c4-d942-404b-a27a-2ba51dd2184a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:38.732270 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.732234 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:38.732386 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.732290 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs podName:e2af22c1-baca-4054-87ff-daf77606438a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:42.732274869 +0000 UTC m=+10.079181196 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs") pod "network-metrics-daemon-jtqkc" (UID: "e2af22c1-baca-4054-87ff-daf77606438a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:38.732386 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.732380 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:38.732514 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.732397 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:38.732514 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.732411 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nrs5w for pod openshift-network-diagnostics/network-check-target-wvpqz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:38.732514 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:38.732444 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w podName:d326ac93-1c28-465f-80fb-a44c5fd5cb0b nodeName:}" failed. No retries permitted until 2026-04-24 21:26:42.732433488 +0000 UTC m=+10.079339819 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nrs5w" (UniqueName: "kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w") pod "network-check-target-wvpqz" (UID: "d326ac93-1c28-465f-80fb-a44c5fd5cb0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:39.741028 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:39.740546 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:39.741028 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:39.740694 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:39.741028 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:39.740752 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret podName:de5ae8c4-d942-404b-a27a-2ba51dd2184a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:41.740733098 +0000 UTC m=+9.087639439 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret") pod "global-pull-secret-syncer-w7g56" (UID: "de5ae8c4-d942-404b-a27a-2ba51dd2184a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:40.143554 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:40.142926 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:40.143554 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:40.143060 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:40.143554 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:40.143062 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:26:40.143554 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:40.143159 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:26:40.143554 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:40.143201 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:40.143554 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:40.143281 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:26:41.756940 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:41.756894 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:41.757439 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:41.757136 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:41.757439 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:41.757203 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret podName:de5ae8c4-d942-404b-a27a-2ba51dd2184a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:45.757182755 +0000 UTC m=+13.104089086 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret") pod "global-pull-secret-syncer-w7g56" (UID: "de5ae8c4-d942-404b-a27a-2ba51dd2184a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:42.142953 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:42.142643 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:42.142953 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:42.142643 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:42.142953 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:42.142787 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:26:42.142953 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:42.142643 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:42.142953 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:42.142898 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:26:42.142953 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:42.142940 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:26:42.767054 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:42.767008 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrs5w\" (UniqueName: \"kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w\") pod \"network-check-target-wvpqz\" (UID: \"d326ac93-1c28-465f-80fb-a44c5fd5cb0b\") " pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:42.767633 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:42.767089 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:42.767633 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:42.767217 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:42.767633 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:42.767270 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs podName:e2af22c1-baca-4054-87ff-daf77606438a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:50.767254267 +0000 UTC m=+18.114160591 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs") pod "network-metrics-daemon-jtqkc" (UID: "e2af22c1-baca-4054-87ff-daf77606438a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:42.767818 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:42.767706 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:42.767818 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:42.767727 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:42.767818 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:42.767740 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nrs5w for pod openshift-network-diagnostics/network-check-target-wvpqz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:42.767818 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:42.767783 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w podName:d326ac93-1c28-465f-80fb-a44c5fd5cb0b nodeName:}" failed. No retries permitted until 2026-04-24 21:26:50.767769761 +0000 UTC m=+18.114676091 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nrs5w" (UniqueName: "kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w") pod "network-check-target-wvpqz" (UID: "d326ac93-1c28-465f-80fb-a44c5fd5cb0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:44.143232 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:44.143190 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:44.143232 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:44.143234 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:44.143778 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:44.143190 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:44.143778 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:44.143323 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:26:44.143778 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:44.143442 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:26:44.143778 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:44.143537 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:26:45.789175 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:45.789136 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:45.789655 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:45.789311 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:45.789655 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:45.789412 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret podName:de5ae8c4-d942-404b-a27a-2ba51dd2184a nodeName:}" failed. No retries permitted until 2026-04-24 21:26:53.789390268 +0000 UTC m=+21.136296609 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret") pod "global-pull-secret-syncer-w7g56" (UID: "de5ae8c4-d942-404b-a27a-2ba51dd2184a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:46.142826 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:46.142738 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:46.142826 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:46.142762 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:46.143041 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:46.142738 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:46.143041 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:46.142883 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:26:46.143041 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:46.142943 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:26:46.143139 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:46.143030 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:26:48.142619 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:48.142592 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:48.142965 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:48.142592 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:48.142965 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:48.142690 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:26:48.142965 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:48.142592 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:48.142965 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:48.142770 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:26:48.142965 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:48.142879 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:26:50.142868 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:50.142825 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:50.143372 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:50.142883 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:50.143372 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:50.142960 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:26:50.143372 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:50.143029 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:50.143372 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:50.143028 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:26:50.143372 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:50.143130 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:26:50.825115 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:50.825079 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrs5w\" (UniqueName: \"kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w\") pod \"network-check-target-wvpqz\" (UID: \"d326ac93-1c28-465f-80fb-a44c5fd5cb0b\") " pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:50.825298 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:50.825138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:50.825298 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:50.825246 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:50.825298 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:50.825248 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:50.825298 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:50.825271 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:50.825298 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:50.825285 2574 projected.go:194] Error preparing data for projected volume kube-api-access-nrs5w for pod openshift-network-diagnostics/network-check-target-wvpqz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:50.825298 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:50.825302 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs podName:e2af22c1-baca-4054-87ff-daf77606438a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:06.825284734 +0000 UTC m=+34.172191061 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs") pod "network-metrics-daemon-jtqkc" (UID: "e2af22c1-baca-4054-87ff-daf77606438a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:50.825628 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:50.825326 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w podName:d326ac93-1c28-465f-80fb-a44c5fd5cb0b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:06.825313545 +0000 UTC m=+34.172219868 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-nrs5w" (UniqueName: "kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w") pod "network-check-target-wvpqz" (UID: "d326ac93-1c28-465f-80fb-a44c5fd5cb0b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:52.142715 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:52.142684 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:52.143159 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:52.142731 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:52.143159 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:52.142683 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:52.143159 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:52.142818 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:26:52.143159 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:52.142919 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:26:52.143159 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:52.143006 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:26:53.289700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.289474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f9nvj" event={"ID":"c6abad34-23d6-4992-ab13-a2cf5ff8141a","Type":"ContainerStarted","Data":"70f5ff4f52acecaa0c832141ff1a0953a683f0c2d8b3ee287148d393650dc6e0"} Apr 24 21:26:53.290801 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.290777 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g2zr8" event={"ID":"eab474e7-7b20-4ca7-aedb-49915bb5ec3e","Type":"ContainerStarted","Data":"75e2a3d4c4a5e9946c9ea88510958517d402a6c2e994f1f5e6b854dcf2d7fa30"} Apr 24 21:26:53.291869 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.291847 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dxnlv" event={"ID":"f9c892b6-3ea7-435d-a215-90c3211e772b","Type":"ContainerStarted","Data":"505386bcf39acce51a465174fc132e749de1606d3db2bbe917be925d315d2312"} Apr 24 21:26:53.293113 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.293090 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rdccs" event={"ID":"0591fbe5-106c-49b3-9a9e-61b9d9370f91","Type":"ContainerStarted","Data":"08ad36ac4212d573090681a4d5ee5f1610e31322e1caf132c54988f18d2018c2"} Apr 24 21:26:53.294611 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.294594 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:26:53.294871 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.294854 2574 generic.go:358] "Generic (PLEG): container finished" podID="1a62288c-a1f0-46d2-b77f-e15d23159b1a" containerID="4636d73a48e68391ab1e6fdf5c92e159c58b8ca049c07c9bc2d24fc7fffe3736" exitCode=1 Apr 24 21:26:53.294924 ip-10-0-133-48 
kubenswrapper[2574]: I0424 21:26:53.294912 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" event={"ID":"1a62288c-a1f0-46d2-b77f-e15d23159b1a","Type":"ContainerStarted","Data":"680e7e0277905a8eeb70ca31206fb27d213bdeee881fd0b7ff653664473cb8ec"} Apr 24 21:26:53.294978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.294934 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" event={"ID":"1a62288c-a1f0-46d2-b77f-e15d23159b1a","Type":"ContainerDied","Data":"4636d73a48e68391ab1e6fdf5c92e159c58b8ca049c07c9bc2d24fc7fffe3736"} Apr 24 21:26:53.294978 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.294962 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" event={"ID":"1a62288c-a1f0-46d2-b77f-e15d23159b1a","Type":"ContainerStarted","Data":"50093af7289433fc3c9b0222a530cfa416fdee6479ec5366d935dae8f3651c78"} Apr 24 21:26:53.296060 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.296037 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" event={"ID":"e6951713-4d29-44cb-82cd-3fe7e6cc1769","Type":"ContainerStarted","Data":"8d33c313addb2ddcc43cdcce631096ef0112d34c5cf7871b3e7c3abb334677e1"} Apr 24 21:26:53.297189 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.297170 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pmknh" event={"ID":"5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7","Type":"ContainerStarted","Data":"38bbb1d790e642e5d81424d803b92ce3b4a4ca1e32d389dd44afe3b225273f42"} Apr 24 21:26:53.298404 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.298376 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" 
event={"ID":"33bbf579-be90-4be9-aa5b-30ac9af3f6d2","Type":"ContainerStarted","Data":"5db25db76a69b3fc10d5741b3cfc9cf0ef04925a50cadec4c962626318031473"} Apr 24 21:26:53.302896 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.302862 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-48.ec2.internal" podStartSLOduration=19.302851717 podStartE2EDuration="19.302851717s" podCreationTimestamp="2026-04-24 21:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:38.270227444 +0000 UTC m=+5.617133791" watchObservedRunningTime="2026-04-24 21:26:53.302851717 +0000 UTC m=+20.649758068" Apr 24 21:26:53.303005 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.302987 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-f9nvj" podStartSLOduration=3.329034738 podStartE2EDuration="20.302982327s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="2026-04-24 21:26:35.747109955 +0000 UTC m=+3.094016280" lastFinishedPulling="2026-04-24 21:26:52.721057531 +0000 UTC m=+20.067963869" observedRunningTime="2026-04-24 21:26:53.30273978 +0000 UTC m=+20.649646127" watchObservedRunningTime="2026-04-24 21:26:53.302982327 +0000 UTC m=+20.649888673" Apr 24 21:26:53.315020 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.314975 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rdccs" podStartSLOduration=11.322356116 podStartE2EDuration="20.31496157s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="2026-04-24 21:26:35.75877128 +0000 UTC m=+3.105677604" lastFinishedPulling="2026-04-24 21:26:44.751376727 +0000 UTC m=+12.098283058" observedRunningTime="2026-04-24 21:26:53.314313757 +0000 UTC m=+20.661220103" watchObservedRunningTime="2026-04-24 
21:26:53.31496157 +0000 UTC m=+20.661867915" Apr 24 21:26:53.377317 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.376674 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-g2zr8" podStartSLOduration=3.4363675799999998 podStartE2EDuration="20.376655712s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="2026-04-24 21:26:35.760019532 +0000 UTC m=+3.106925870" lastFinishedPulling="2026-04-24 21:26:52.700307663 +0000 UTC m=+20.047214002" observedRunningTime="2026-04-24 21:26:53.375917182 +0000 UTC m=+20.722823531" watchObservedRunningTime="2026-04-24 21:26:53.376655712 +0000 UTC m=+20.723562059" Apr 24 21:26:53.392902 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.392840 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pmknh" podStartSLOduration=3.448201994 podStartE2EDuration="20.392821545s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="2026-04-24 21:26:35.755689552 +0000 UTC m=+3.102595876" lastFinishedPulling="2026-04-24 21:26:52.700309088 +0000 UTC m=+20.047215427" observedRunningTime="2026-04-24 21:26:53.392737374 +0000 UTC m=+20.739643717" watchObservedRunningTime="2026-04-24 21:26:53.392821545 +0000 UTC m=+20.739727891" Apr 24 21:26:53.845698 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:53.845662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:53.845837 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:53.845813 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:53.845893 ip-10-0-133-48 
kubenswrapper[2574]: E0424 21:26:53.845882 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret podName:de5ae8c4-d942-404b-a27a-2ba51dd2184a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:09.845867287 +0000 UTC m=+37.192773611 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret") pod "global-pull-secret-syncer-w7g56" (UID: "de5ae8c4-d942-404b-a27a-2ba51dd2184a") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:54.031003 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.030836 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:26:54.142304 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.142221 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:26:54.030996967Z","UUID":"e35d303a-8ba2-46f8-9ad1-9327e74d9e16","Handler":null,"Name":"","Endpoint":""} Apr 24 21:26:54.142551 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.142489 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:54.142605 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.142494 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:54.142605 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:54.142582 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:26:54.142671 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.142654 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:54.142731 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:54.142650 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:26:54.142773 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:54.142713 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:26:54.145201 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.145182 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:26:54.145284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.145208 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:26:54.302230 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.302138 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" event={"ID":"e6951713-4d29-44cb-82cd-3fe7e6cc1769","Type":"ContainerStarted","Data":"45def7191519ce7ddd6b5df088ce4911894ac33984021d0af1eaf3602206cd42"} Apr 24 21:26:54.303352 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.303315 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4fzdt" event={"ID":"998844f7-6ade-465d-8041-762411d1f8e2","Type":"ContainerStarted","Data":"b10b344947a9c9ce9eba4c01767cd50ecf0b55393d3021ca5167a927664f0f96"} Apr 24 21:26:54.304576 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.304551 2574 generic.go:358] "Generic (PLEG): container finished" podID="f9c892b6-3ea7-435d-a215-90c3211e772b" containerID="505386bcf39acce51a465174fc132e749de1606d3db2bbe917be925d315d2312" exitCode=0 Apr 24 21:26:54.304656 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.304614 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dxnlv" event={"ID":"f9c892b6-3ea7-435d-a215-90c3211e772b","Type":"ContainerDied","Data":"505386bcf39acce51a465174fc132e749de1606d3db2bbe917be925d315d2312"} Apr 24 21:26:54.307208 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.307192 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:26:54.307524 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.307499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" event={"ID":"1a62288c-a1f0-46d2-b77f-e15d23159b1a","Type":"ContainerStarted","Data":"0f4b888d1549555e0f3f6bf704bace81612023d6b887c666597f9d51c582f411"} Apr 24 21:26:54.307608 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.307535 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" event={"ID":"1a62288c-a1f0-46d2-b77f-e15d23159b1a","Type":"ContainerStarted","Data":"d3b114b6ddf1ae35af92c81ff11dab60616500c3490a938665ec678b66d53798"} Apr 24 21:26:54.307608 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.307550 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" event={"ID":"1a62288c-a1f0-46d2-b77f-e15d23159b1a","Type":"ContainerStarted","Data":"2adc14308da343181c787df294356580ea9aeebf2a4f593185ab05a3595b7803"} Apr 24 21:26:54.317481 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.317435 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4fzdt" podStartSLOduration=4.36942541 podStartE2EDuration="21.317422315s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="2026-04-24 21:26:35.752396059 +0000 UTC m=+3.099302396" lastFinishedPulling="2026-04-24 21:26:52.700392965 +0000 UTC m=+20.047299301" observedRunningTime="2026-04-24 21:26:54.316598504 +0000 UTC m=+21.663504851" watchObservedRunningTime="2026-04-24 21:26:54.317422315 +0000 UTC m=+21.664328660" Apr 24 21:26:54.317884 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:54.317861 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-nwk6j" podStartSLOduration=4.326175465 podStartE2EDuration="21.317854303s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="2026-04-24 21:26:35.748499019 +0000 UTC m=+3.095405343" lastFinishedPulling="2026-04-24 21:26:52.740177842 +0000 UTC m=+20.087084181" observedRunningTime="2026-04-24 21:26:53.408213308 +0000 UTC m=+20.755119654" watchObservedRunningTime="2026-04-24 21:26:54.317854303 +0000 UTC m=+21.664760652" Apr 24 21:26:55.311954 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:55.311856 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" event={"ID":"e6951713-4d29-44cb-82cd-3fe7e6cc1769","Type":"ContainerStarted","Data":"4502e844b615ac97f445d2fe2343de63a178f26faafc11eb65d917df4b08ccb4"} Apr 24 21:26:55.422581 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:55.422543 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rdccs" Apr 24 21:26:55.423271 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:55.423249 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rdccs" Apr 24 21:26:55.455052 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:55.454969 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pm4q6" podStartSLOduration=3.1682706290000002 podStartE2EDuration="22.45494826s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="2026-04-24 21:26:35.760715263 +0000 UTC m=+3.107621587" lastFinishedPulling="2026-04-24 21:26:55.047392879 +0000 UTC m=+22.394299218" observedRunningTime="2026-04-24 21:26:55.335935858 +0000 UTC m=+22.682842205" watchObservedRunningTime="2026-04-24 21:26:55.45494826 +0000 UTC m=+22.801854602" Apr 24 21:26:56.143638 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:56.143543 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:56.143789 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:56.143543 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:56.143789 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:56.143672 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:26:56.143789 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:56.143544 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:56.143789 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:56.143755 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:26:56.143992 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:56.143812 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:26:56.316016 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:56.315980 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:26:56.316570 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:56.316432 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" event={"ID":"1a62288c-a1f0-46d2-b77f-e15d23159b1a","Type":"ContainerStarted","Data":"3ce501c62efef2e549bf86dd57d5774c8f667a37bdc9c44eed8ed46537c4789d"} Apr 24 21:26:56.316719 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:56.316691 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rdccs" Apr 24 21:26:56.317193 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:56.317173 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rdccs" Apr 24 21:26:58.142637 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:58.142612 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:26:58.143314 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:58.142727 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:26:58.143314 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:58.142758 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:26:58.143314 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:58.142826 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:26:58.143314 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:58.142726 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:26:58.143314 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:26:58.142902 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:26:58.325503 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:58.325245 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:26:59.328733 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:59.328700 2574 generic.go:358] "Generic (PLEG): container finished" podID="f9c892b6-3ea7-435d-a215-90c3211e772b" containerID="126699c369430540c4ddbfa1ca98fc1ff6f6689fae87b66861e0fdf4b211e0ec" exitCode=0 Apr 24 21:26:59.329172 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:59.328791 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dxnlv" event={"ID":"f9c892b6-3ea7-435d-a215-90c3211e772b","Type":"ContainerDied","Data":"126699c369430540c4ddbfa1ca98fc1ff6f6689fae87b66861e0fdf4b211e0ec"} Apr 24 21:26:59.331721 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:59.331706 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:26:59.332018 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:59.331989 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" event={"ID":"1a62288c-a1f0-46d2-b77f-e15d23159b1a","Type":"ContainerStarted","Data":"89d3dcece7e10b368327e574ee61dc7b9c8bd659bf932a8c97199ff278ca78d3"} Apr 24 21:26:59.332301 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:59.332280 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:59.332403 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:59.332312 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:59.332403 
ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:59.332325 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:59.332548 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:59.332468 2574 scope.go:117] "RemoveContainer" containerID="4636d73a48e68391ab1e6fdf5c92e159c58b8ca049c07c9bc2d24fc7fffe3736" Apr 24 21:26:59.350325 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:59.350304 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:26:59.350417 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:26:59.350392 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:27:00.143079 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.142850 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:27:00.143211 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.142851 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:27:00.143211 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:00.143101 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:27:00.143211 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:00.143180 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:27:00.143211 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.142850 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:27:00.143379 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:00.143240 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:27:00.229710 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.229679 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-w7g56"] Apr 24 21:27:00.232810 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.232779 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wvpqz"] Apr 24 21:27:00.233199 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.233184 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jtqkc"] Apr 24 21:27:00.338385 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.338356 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:27:00.338803 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.338784 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" event={"ID":"1a62288c-a1f0-46d2-b77f-e15d23159b1a","Type":"ContainerStarted","Data":"2b1cf71c4d3ed72e27f93f6b984751fbf211a6a99dfad89dbda02e8ea3e84451"} Apr 24 21:27:00.340537 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.340514 2574 generic.go:358] "Generic (PLEG): container finished" podID="f9c892b6-3ea7-435d-a215-90c3211e772b" containerID="c88f2c7893b14e3d67420356e4a9ff563c3ce20218b5c4ed7392e4c322f8d1e5" exitCode=0 Apr 24 21:27:00.340648 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.340588 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:27:00.340648 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.340605 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dxnlv" event={"ID":"f9c892b6-3ea7-435d-a215-90c3211e772b","Type":"ContainerDied","Data":"c88f2c7893b14e3d67420356e4a9ff563c3ce20218b5c4ed7392e4c322f8d1e5"} Apr 24 21:27:00.340735 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.340655 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:27:00.340767 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:00.340745 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:27:00.340834 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:00.340820 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:27:00.340867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.340850 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:27:00.340913 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:00.340901 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:27:00.371392 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:00.371321 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" podStartSLOduration=10.341442207 podStartE2EDuration="27.371308105s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="2026-04-24 21:26:35.755367428 +0000 UTC m=+3.102273755" lastFinishedPulling="2026-04-24 21:26:52.785233329 +0000 UTC m=+20.132139653" observedRunningTime="2026-04-24 21:27:00.370127702 +0000 UTC m=+27.717034058" watchObservedRunningTime="2026-04-24 21:27:00.371308105 +0000 UTC m=+27.718214450" Apr 24 21:27:01.344249 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:01.344214 2574 generic.go:358] "Generic (PLEG): container finished" podID="f9c892b6-3ea7-435d-a215-90c3211e772b" containerID="d9c1350c826552e95b4ada68da1b2022f6a5915f6562399db7f277a9d9ceff68" exitCode=0 Apr 24 21:27:01.344717 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:01.344252 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dxnlv" event={"ID":"f9c892b6-3ea7-435d-a215-90c3211e772b","Type":"ContainerDied","Data":"d9c1350c826552e95b4ada68da1b2022f6a5915f6562399db7f277a9d9ceff68"} Apr 24 21:27:02.142683 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:02.142651 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:27:02.142870 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:02.142654 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:27:02.142870 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:02.142779 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:27:02.142870 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:02.142838 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:27:02.142870 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:02.142653 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:27:02.143062 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:02.142943 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:27:04.143141 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.143109 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:27:04.143953 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.143234 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:27:04.143953 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.143268 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56" Apr 24 21:27:04.143953 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:04.143232 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvpqz" podUID="d326ac93-1c28-465f-80fb-a44c5fd5cb0b" Apr 24 21:27:04.143953 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:04.143381 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a" Apr 24 21:27:04.143953 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:04.143440 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-w7g56" podUID="de5ae8c4-d942-404b-a27a-2ba51dd2184a" Apr 24 21:27:04.935243 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.935217 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-48.ec2.internal" event="NodeReady" Apr 24 21:27:04.935540 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.935386 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:27:04.979363 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.979308 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-cc4869f54-mx6ks"] Apr 24 21:27:04.982835 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.982808 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jkddd"] Apr 24 21:27:04.983014 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.982992 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:04.985562 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.985534 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" Apr 24 21:27:04.985919 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.985856 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:27:04.986040 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.985985 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:27:04.986103 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.985985 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:27:04.986103 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.986080 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ptvss\"" Apr 24 21:27:04.988620 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.988593 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 21:27:04.988752 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.988592 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 21:27:04.988752 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.988713 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-pxhww\"" Apr 24 21:27:04.997529 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.997506 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:27:04.999389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:04.999359 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jkddd"] Apr 24 21:27:05.000141 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.000108 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-cc4869f54-mx6ks"] Apr 24 21:27:05.010786 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.010581 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-drrd5"] Apr 24 21:27:05.013885 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.013859 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zq6rm"] Apr 24 21:27:05.014095 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.014075 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-drrd5" Apr 24 21:27:05.016658 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.016619 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zq6rm" Apr 24 21:27:05.016824 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.016800 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:27:05.017099 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.017070 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mzt4d\"" Apr 24 21:27:05.017952 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.017935 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:27:05.021906 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.021890 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:27:05.022107 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.021929 2574 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:27:05.022484 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.022464 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:27:05.022598 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.022533 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-42hwh\"" Apr 24 21:27:05.022927 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.022899 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zq6rm"] Apr 24 21:27:05.027166 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.027146 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-drrd5"] Apr 24 21:27:05.127804 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.127733 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5" Apr 24 21:27:05.127804 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.127767 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:05.127804 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.127788 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/82766c41-fe9c-427b-8e96-271dc66e21d8-ca-trust-extracted\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:05.127804 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.127807 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzgd5\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-kube-api-access-pzgd5\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:05.128180 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.127848 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-bound-sa-token\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:05.128180 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.127903 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82766c41-fe9c-427b-8e96-271dc66e21d8-trusted-ca\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:05.128180 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.127962 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-config-volume\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5" Apr 24 
21:27:05.128180 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.128040 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fpr6\" (UniqueName: \"kubernetes.io/projected/645772fe-eb62-4443-a6dd-6a10b3593053-kube-api-access-6fpr6\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm" Apr 24 21:27:05.128180 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.128104 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" Apr 24 21:27:05.128180 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.128125 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82766c41-fe9c-427b-8e96-271dc66e21d8-installation-pull-secrets\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:05.128180 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.128146 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bhlx\" (UniqueName: \"kubernetes.io/projected/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-kube-api-access-2bhlx\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5" Apr 24 21:27:05.128180 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.128168 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-certificates\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:05.128180 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.128182 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm" Apr 24 21:27:05.128556 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.128244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" Apr 24 21:27:05.128556 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.128271 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-tmp-dir\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5" Apr 24 21:27:05.128556 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.128290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/82766c41-fe9c-427b-8e96-271dc66e21d8-image-registry-private-configuration\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " 
pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:05.229377 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-bound-sa-token\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229409 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82766c41-fe9c-427b-8e96-271dc66e21d8-trusted-ca\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229433 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-config-volume\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5" Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fpr6\" (UniqueName: \"kubernetes.io/projected/645772fe-eb62-4443-a6dd-6a10b3593053-kube-api-access-6fpr6\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm" Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229637 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd"
Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82766c41-fe9c-427b-8e96-271dc66e21d8-installation-pull-secrets\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229689 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bhlx\" (UniqueName: \"kubernetes.io/projected/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-kube-api-access-2bhlx\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5"
Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229714 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-certificates\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229739 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm"
Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229773 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd"
Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229797 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-tmp-dir\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5"
Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/82766c41-fe9c-427b-8e96-271dc66e21d8-image-registry-private-configuration\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229859 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5"
Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229887 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.229913 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229910 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82766c41-fe9c-427b-8e96-271dc66e21d8-ca-trust-extracted\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.229936 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzgd5\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-kube-api-access-pzgd5\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.230187 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-certificates\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.230188 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-config-volume\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5"
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.230396 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82766c41-fe9c-427b-8e96-271dc66e21d8-trusted-ca\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.229797 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.230462 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-tmp-dir\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5"
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.230262 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.230291 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.230361 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.230566 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cc4869f54-mx6ks: secret "image-registry-tls" not found
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.230491 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert podName:1cd5a898-ba76-4c36-ab46-16db7f1b61bd nodeName:}" failed. No retries permitted until 2026-04-24 21:27:05.730472405 +0000 UTC m=+33.077378735 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jkddd" (UID: "1cd5a898-ba76-4c36-ab46-16db7f1b61bd") : secret "networking-console-plugin-cert" not found
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.230614 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls podName:ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:05.730596951 +0000 UTC m=+33.077503290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls") pod "dns-default-drrd5" (UID: "ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:05.230618 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.230628 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert podName:645772fe-eb62-4443-a6dd-6a10b3593053 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:05.730619733 +0000 UTC m=+33.077526057 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert") pod "ingress-canary-zq6rm" (UID: "645772fe-eb62-4443-a6dd-6a10b3593053") : secret "canary-serving-cert" not found
Apr 24 21:27:05.231059 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.230641 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls podName:82766c41-fe9c-427b-8e96-271dc66e21d8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:05.730633268 +0000 UTC m=+33.077539597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls") pod "image-registry-cc4869f54-mx6ks" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8") : secret "image-registry-tls" not found
Apr 24 21:27:05.231059 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.230642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82766c41-fe9c-427b-8e96-271dc66e21d8-ca-trust-extracted\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.231059 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.230904 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd"
Apr 24 21:27:05.234855 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.234719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/82766c41-fe9c-427b-8e96-271dc66e21d8-image-registry-private-configuration\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.234972 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.234732 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82766c41-fe9c-427b-8e96-271dc66e21d8-installation-pull-secrets\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.243218 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.243193 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzgd5\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-kube-api-access-pzgd5\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.243460 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.243436 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fpr6\" (UniqueName: \"kubernetes.io/projected/645772fe-eb62-4443-a6dd-6a10b3593053-kube-api-access-6fpr6\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm"
Apr 24 21:27:05.244782 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.244761 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bhlx\" (UniqueName: \"kubernetes.io/projected/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-kube-api-access-2bhlx\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5"
Apr 24 21:27:05.249729 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.249707 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-bound-sa-token\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.733230 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.733190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd"
Apr 24 21:27:05.733528 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.733241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm"
Apr 24 21:27:05.733528 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.733282 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5"
Apr 24 21:27:05.733528 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:05.733313 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:05.733528 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.733376 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:05.733528 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.733434 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:27:05.733528 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.733461 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert podName:645772fe-eb62-4443-a6dd-6a10b3593053 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:06.73343959 +0000 UTC m=+34.080345915 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert") pod "ingress-canary-zq6rm" (UID: "645772fe-eb62-4443-a6dd-6a10b3593053") : secret "canary-serving-cert" not found
Apr 24 21:27:05.733528 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.733437 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:05.733528 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.733481 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cc4869f54-mx6ks: secret "image-registry-tls" not found
Apr 24 21:27:05.733528 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.733512 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls podName:82766c41-fe9c-427b-8e96-271dc66e21d8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:06.73349733 +0000 UTC m=+34.080403667 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls") pod "image-registry-cc4869f54-mx6ks" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8") : secret "image-registry-tls" not found
Apr 24 21:27:05.733528 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.733376 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:05.733528 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.733530 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert podName:1cd5a898-ba76-4c36-ab46-16db7f1b61bd nodeName:}" failed. No retries permitted until 2026-04-24 21:27:06.733519918 +0000 UTC m=+34.080426247 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jkddd" (UID: "1cd5a898-ba76-4c36-ab46-16db7f1b61bd") : secret "networking-console-plugin-cert" not found
Apr 24 21:27:05.733937 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:05.733592 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls podName:ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:06.733572769 +0000 UTC m=+34.080479101 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls") pod "dns-default-drrd5" (UID: "ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:06.143381 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.143272 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc"
Apr 24 21:27:06.143381 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.143300 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz"
Apr 24 21:27:06.143381 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.143306 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56"
Apr 24 21:27:06.146858 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.146716 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:27:06.146858 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.146726 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:27:06.146858 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.146761 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:27:06.146858 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.146777 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:27:06.146858 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.146738 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hvhzf\""
Apr 24 21:27:06.147189 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.146896 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5kts7\""
Apr 24 21:27:06.740421 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.740382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd"
Apr 24 21:27:06.740421 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.740423 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm"
Apr 24 21:27:06.740887 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.740454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5"
Apr 24 21:27:06.740887 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.740472 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:06.740887 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:06.740555 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:27:06.740887 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:06.740568 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:06.740887 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:06.740588 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cc4869f54-mx6ks: secret "image-registry-tls" not found
Apr 24 21:27:06.740887 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:06.740555 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:06.740887 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:06.740621 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert podName:1cd5a898-ba76-4c36-ab46-16db7f1b61bd nodeName:}" failed. No retries permitted until 2026-04-24 21:27:08.74060482 +0000 UTC m=+36.087511147 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jkddd" (UID: "1cd5a898-ba76-4c36-ab46-16db7f1b61bd") : secret "networking-console-plugin-cert" not found
Apr 24 21:27:06.740887 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:06.740555 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:06.740887 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:06.740636 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls podName:82766c41-fe9c-427b-8e96-271dc66e21d8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:08.740629644 +0000 UTC m=+36.087535968 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls") pod "image-registry-cc4869f54-mx6ks" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8") : secret "image-registry-tls" not found
Apr 24 21:27:06.740887 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:06.740649 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert podName:645772fe-eb62-4443-a6dd-6a10b3593053 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:08.740641748 +0000 UTC m=+36.087548071 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert") pod "ingress-canary-zq6rm" (UID: "645772fe-eb62-4443-a6dd-6a10b3593053") : secret "canary-serving-cert" not found
Apr 24 21:27:06.740887 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:06.740659 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls podName:ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:08.740653664 +0000 UTC m=+36.087559988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls") pod "dns-default-drrd5" (UID: "ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:06.842513 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.842471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrs5w\" (UniqueName: \"kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w\") pod \"network-check-target-wvpqz\" (UID: \"d326ac93-1c28-465f-80fb-a44c5fd5cb0b\") " pod="openshift-network-diagnostics/network-check-target-wvpqz"
Apr 24 21:27:06.842695 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.842541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc"
Apr 24 21:27:06.842752 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:06.842702 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:27:06.842799 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:06.842775 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs podName:e2af22c1-baca-4054-87ff-daf77606438a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.842755018 +0000 UTC m=+66.189661356 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs") pod "network-metrics-daemon-jtqkc" (UID: "e2af22c1-baca-4054-87ff-daf77606438a") : secret "metrics-daemon-secret" not found
Apr 24 21:27:06.846585 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:06.846558 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrs5w\" (UniqueName: \"kubernetes.io/projected/d326ac93-1c28-465f-80fb-a44c5fd5cb0b-kube-api-access-nrs5w\") pod \"network-check-target-wvpqz\" (UID: \"d326ac93-1c28-465f-80fb-a44c5fd5cb0b\") " pod="openshift-network-diagnostics/network-check-target-wvpqz"
Apr 24 21:27:07.063250 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:07.063176 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvpqz"
Apr 24 21:27:07.318606 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:07.318527 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wvpqz"]
Apr 24 21:27:07.323510 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:27:07.323486 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd326ac93_1c28_465f_80fb_a44c5fd5cb0b.slice/crio-fcbcf848db6d1d226e67e1a870d5fe14be34c0eec3e594d7d724370899557ef5 WatchSource:0}: Error finding container fcbcf848db6d1d226e67e1a870d5fe14be34c0eec3e594d7d724370899557ef5: Status 404 returned error can't find the container with id fcbcf848db6d1d226e67e1a870d5fe14be34c0eec3e594d7d724370899557ef5
Apr 24 21:27:07.357682 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:07.357641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wvpqz" event={"ID":"d326ac93-1c28-465f-80fb-a44c5fd5cb0b","Type":"ContainerStarted","Data":"fcbcf848db6d1d226e67e1a870d5fe14be34c0eec3e594d7d724370899557ef5"}
Apr 24 21:27:07.360543 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:07.360508 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dxnlv" event={"ID":"f9c892b6-3ea7-435d-a215-90c3211e772b","Type":"ContainerStarted","Data":"f589f2277186b700f2dc44d5d6f7fad72fdfa6b48f844bb5cd19f59433221812"}
Apr 24 21:27:08.364852 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:08.364818 2574 generic.go:358] "Generic (PLEG): container finished" podID="f9c892b6-3ea7-435d-a215-90c3211e772b" containerID="f589f2277186b700f2dc44d5d6f7fad72fdfa6b48f844bb5cd19f59433221812" exitCode=0
Apr 24 21:27:08.365276 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:08.364878 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dxnlv" event={"ID":"f9c892b6-3ea7-435d-a215-90c3211e772b","Type":"ContainerDied","Data":"f589f2277186b700f2dc44d5d6f7fad72fdfa6b48f844bb5cd19f59433221812"}
Apr 24 21:27:08.760576 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:08.760291 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd"
Apr 24 21:27:08.760760 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:08.760603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm"
Apr 24 21:27:08.760760 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:08.760456 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:27:08.760760 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:08.760654 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5"
Apr 24 21:27:08.760760 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:08.760683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:27:08.760760 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:08.760702 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert podName:1cd5a898-ba76-4c36-ab46-16db7f1b61bd nodeName:}" failed. No retries permitted until 2026-04-24 21:27:12.760685208 +0000 UTC m=+40.107591531 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jkddd" (UID: "1cd5a898-ba76-4c36-ab46-16db7f1b61bd") : secret "networking-console-plugin-cert" not found
Apr 24 21:27:08.760760 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:08.760757 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:08.761043 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:08.760771 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cc4869f54-mx6ks: secret "image-registry-tls" not found
Apr 24 21:27:08.761043 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:08.760810 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:08.761043 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:08.760813 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls podName:82766c41-fe9c-427b-8e96-271dc66e21d8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:12.76079847 +0000 UTC m=+40.107704815 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls") pod "image-registry-cc4869f54-mx6ks" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8") : secret "image-registry-tls" not found
Apr 24 21:27:08.761043 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:08.760873 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert podName:645772fe-eb62-4443-a6dd-6a10b3593053 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:12.760858006 +0000 UTC m=+40.107764334 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert") pod "ingress-canary-zq6rm" (UID: "645772fe-eb62-4443-a6dd-6a10b3593053") : secret "canary-serving-cert" not found
Apr 24 21:27:08.761043 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:08.760904 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:08.761043 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:08.761033 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls podName:ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:12.761019272 +0000 UTC m=+40.107925609 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls") pod "dns-default-drrd5" (UID: "ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:09.370266 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:09.370230 2574 generic.go:358] "Generic (PLEG): container finished" podID="f9c892b6-3ea7-435d-a215-90c3211e772b" containerID="1f5433453691192480028b1ae620cb40db56568e1c6660680330fd702e2018e9" exitCode=0
Apr 24 21:27:09.370723 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:09.370277 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dxnlv" event={"ID":"f9c892b6-3ea7-435d-a215-90c3211e772b","Type":"ContainerDied","Data":"1f5433453691192480028b1ae620cb40db56568e1c6660680330fd702e2018e9"}
Apr 24 21:27:09.869369 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:09.869310 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56"
Apr 24 21:27:09.873691 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:09.873658 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/de5ae8c4-d942-404b-a27a-2ba51dd2184a-original-pull-secret\") pod \"global-pull-secret-syncer-w7g56\" (UID: \"de5ae8c4-d942-404b-a27a-2ba51dd2184a\") " pod="kube-system/global-pull-secret-syncer-w7g56"
Apr 24 21:27:10.069538 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:10.069518 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w7g56"
Apr 24 21:27:10.197398 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:10.197361 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-w7g56"]
Apr 24 21:27:10.200551 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:27:10.200522 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde5ae8c4_d942_404b_a27a_2ba51dd2184a.slice/crio-f1c1a685a946b98e15daaf79d09e86b36c332edb579f9fb83dc100bc4db8545b WatchSource:0}: Error finding container f1c1a685a946b98e15daaf79d09e86b36c332edb579f9fb83dc100bc4db8545b: Status 404 returned error can't find the container with id f1c1a685a946b98e15daaf79d09e86b36c332edb579f9fb83dc100bc4db8545b
Apr 24 21:27:10.374388 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:10.374285 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dxnlv" event={"ID":"f9c892b6-3ea7-435d-a215-90c3211e772b","Type":"ContainerStarted","Data":"3ae351e0c249e2949e136b613a3181fb3992c2c7a71d65d58fdb43a7cdc055cf"}
Apr 24 21:27:10.375390
ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:10.375359 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-w7g56" event={"ID":"de5ae8c4-d942-404b-a27a-2ba51dd2184a","Type":"ContainerStarted","Data":"f1c1a685a946b98e15daaf79d09e86b36c332edb579f9fb83dc100bc4db8545b"} Apr 24 21:27:10.400742 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:10.400695 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dxnlv" podStartSLOduration=5.983160582 podStartE2EDuration="37.400681751s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="2026-04-24 21:26:35.757538411 +0000 UTC m=+3.104444749" lastFinishedPulling="2026-04-24 21:27:07.175059588 +0000 UTC m=+34.521965918" observedRunningTime="2026-04-24 21:27:10.400416895 +0000 UTC m=+37.747323239" watchObservedRunningTime="2026-04-24 21:27:10.400681751 +0000 UTC m=+37.747588094" Apr 24 21:27:11.379386 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:11.378984 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wvpqz" event={"ID":"d326ac93-1c28-465f-80fb-a44c5fd5cb0b","Type":"ContainerStarted","Data":"6e4ab6341b4976d1f9eb13ccccf3bc285a1841f9c016fb9e68f1c68692c53d07"} Apr 24 21:27:11.379778 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:11.379468 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-wvpqz" Apr 24 21:27:11.396597 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:11.396539 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-wvpqz" podStartSLOduration=35.136431205 podStartE2EDuration="38.396519969s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="2026-04-24 21:27:07.325422721 +0000 UTC m=+34.672329048" lastFinishedPulling="2026-04-24 
21:27:10.585511489 +0000 UTC m=+37.932417812" observedRunningTime="2026-04-24 21:27:11.39568836 +0000 UTC m=+38.742594705" watchObservedRunningTime="2026-04-24 21:27:11.396519969 +0000 UTC m=+38.743426315" Apr 24 21:27:12.794915 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:12.794874 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" Apr 24 21:27:12.795471 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:12.794925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm" Apr 24 21:27:12.795471 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:12.794963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5" Apr 24 21:27:12.795471 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:12.794990 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:12.795471 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:12.795037 2574 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:12.795471 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:12.795094 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:12.795471 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:12.795100 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:12.795471 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:12.795114 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cc4869f54-mx6ks: secret "image-registry-tls" not found Apr 24 21:27:12.795471 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:12.795118 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert podName:1cd5a898-ba76-4c36-ab46-16db7f1b61bd nodeName:}" failed. No retries permitted until 2026-04-24 21:27:20.795096189 +0000 UTC m=+48.142002520 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jkddd" (UID: "1cd5a898-ba76-4c36-ab46-16db7f1b61bd") : secret "networking-console-plugin-cert" not found Apr 24 21:27:12.795471 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:12.795037 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:12.795471 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:12.795137 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls podName:ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:20.795127737 +0000 UTC m=+48.142034061 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls") pod "dns-default-drrd5" (UID: "ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399") : secret "dns-default-metrics-tls" not found Apr 24 21:27:12.795471 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:12.795161 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls podName:82766c41-fe9c-427b-8e96-271dc66e21d8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:20.795144998 +0000 UTC m=+48.142051351 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls") pod "image-registry-cc4869f54-mx6ks" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8") : secret "image-registry-tls" not found Apr 24 21:27:12.795471 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:12.795185 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert podName:645772fe-eb62-4443-a6dd-6a10b3593053 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:20.795166944 +0000 UTC m=+48.142073279 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert") pod "ingress-canary-zq6rm" (UID: "645772fe-eb62-4443-a6dd-6a10b3593053") : secret "canary-serving-cert" not found Apr 24 21:27:14.387046 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:14.386958 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-w7g56" event={"ID":"de5ae8c4-d942-404b-a27a-2ba51dd2184a","Type":"ContainerStarted","Data":"f6d1061ee7a27af5d95097eca98a16ff9807aa86dda21cfca3290584f276b045"} Apr 24 21:27:14.402254 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:14.402198 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-w7g56" podStartSLOduration=32.571984921 podStartE2EDuration="36.402184813s" podCreationTimestamp="2026-04-24 21:26:38 +0000 UTC" firstStartedPulling="2026-04-24 21:27:10.202051897 +0000 UTC m=+37.548958220" lastFinishedPulling="2026-04-24 21:27:14.032251786 +0000 UTC m=+41.379158112" observedRunningTime="2026-04-24 21:27:14.401354537 +0000 UTC m=+41.748260876" watchObservedRunningTime="2026-04-24 21:27:14.402184813 +0000 UTC m=+41.749091157" Apr 24 21:27:17.776700 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.776664 2574 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p"] Apr 24 21:27:17.778388 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.778372 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" Apr 24 21:27:17.781553 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.781517 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 21:27:17.781553 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.781547 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-twfgs\"" Apr 24 21:27:17.781729 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.781585 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 21:27:17.781729 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.781592 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 21:27:17.781729 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.781517 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 21:27:17.788619 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.788599 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p"] Apr 24 21:27:17.832658 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.832627 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt655\" 
(UniqueName: \"kubernetes.io/projected/92fe8caa-8607-4158-bac2-d5f9e631a096-kube-api-access-mt655\") pod \"managed-serviceaccount-addon-agent-8b57b85fc-ppz8p\" (UID: \"92fe8caa-8607-4158-bac2-d5f9e631a096\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" Apr 24 21:27:17.832658 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.832660 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/92fe8caa-8607-4158-bac2-d5f9e631a096-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8b57b85fc-ppz8p\" (UID: \"92fe8caa-8607-4158-bac2-d5f9e631a096\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" Apr 24 21:27:17.933526 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.933476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt655\" (UniqueName: \"kubernetes.io/projected/92fe8caa-8607-4158-bac2-d5f9e631a096-kube-api-access-mt655\") pod \"managed-serviceaccount-addon-agent-8b57b85fc-ppz8p\" (UID: \"92fe8caa-8607-4158-bac2-d5f9e631a096\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" Apr 24 21:27:17.933526 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.933526 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/92fe8caa-8607-4158-bac2-d5f9e631a096-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8b57b85fc-ppz8p\" (UID: \"92fe8caa-8607-4158-bac2-d5f9e631a096\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" Apr 24 21:27:17.936715 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.936692 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/92fe8caa-8607-4158-bac2-d5f9e631a096-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8b57b85fc-ppz8p\" (UID: \"92fe8caa-8607-4158-bac2-d5f9e631a096\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" Apr 24 21:27:17.949272 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:17.949243 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt655\" (UniqueName: \"kubernetes.io/projected/92fe8caa-8607-4158-bac2-d5f9e631a096-kube-api-access-mt655\") pod \"managed-serviceaccount-addon-agent-8b57b85fc-ppz8p\" (UID: \"92fe8caa-8607-4158-bac2-d5f9e631a096\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" Apr 24 21:27:18.097896 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:18.097803 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" Apr 24 21:27:18.213886 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:18.213853 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p"] Apr 24 21:27:18.219795 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:27:18.219766 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92fe8caa_8607_4158_bac2_d5f9e631a096.slice/crio-e34860ec1a953c83af0bedac15da7bde3f744cac0704b7a400db06188b585384 WatchSource:0}: Error finding container e34860ec1a953c83af0bedac15da7bde3f744cac0704b7a400db06188b585384: Status 404 returned error can't find the container with id e34860ec1a953c83af0bedac15da7bde3f744cac0704b7a400db06188b585384 Apr 24 21:27:18.397844 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:18.397752 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" event={"ID":"92fe8caa-8607-4158-bac2-d5f9e631a096","Type":"ContainerStarted","Data":"e34860ec1a953c83af0bedac15da7bde3f744cac0704b7a400db06188b585384"} Apr 24 21:27:20.859011 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:20.858965 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm" Apr 24 21:27:20.859405 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:20.859056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5" Apr 24 21:27:20.859405 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:20.859092 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:20.859405 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:20.859105 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:20.859405 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:20.859173 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert podName:645772fe-eb62-4443-a6dd-6a10b3593053 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:36.859153948 +0000 UTC m=+64.206060273 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert") pod "ingress-canary-zq6rm" (UID: "645772fe-eb62-4443-a6dd-6a10b3593053") : secret "canary-serving-cert" not found Apr 24 21:27:20.859405 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:20.859198 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:20.859405 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:20.859219 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" Apr 24 21:27:20.859405 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:20.859229 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:20.859405 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:20.859244 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cc4869f54-mx6ks: secret "image-registry-tls" not found Apr 24 21:27:20.859405 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:20.859257 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls podName:ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:36.859240917 +0000 UTC m=+64.206147246 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls") pod "dns-default-drrd5" (UID: "ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399") : secret "dns-default-metrics-tls" not found Apr 24 21:27:20.859405 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:20.859288 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls podName:82766c41-fe9c-427b-8e96-271dc66e21d8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:36.859273957 +0000 UTC m=+64.206180284 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls") pod "image-registry-cc4869f54-mx6ks" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8") : secret "image-registry-tls" not found Apr 24 21:27:20.859405 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:20.859294 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:20.859405 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:20.859322 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert podName:1cd5a898-ba76-4c36-ab46-16db7f1b61bd nodeName:}" failed. No retries permitted until 2026-04-24 21:27:36.859314414 +0000 UTC m=+64.206220738 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jkddd" (UID: "1cd5a898-ba76-4c36-ab46-16db7f1b61bd") : secret "networking-console-plugin-cert" not found Apr 24 21:27:22.406865 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:22.406825 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" event={"ID":"92fe8caa-8607-4158-bac2-d5f9e631a096","Type":"ContainerStarted","Data":"ce0334440dfdc2a3530c1c99838b807615e64862540e4da103b51c733934f1a6"} Apr 24 21:27:22.425821 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:22.425770 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" podStartSLOduration=2.089300673 podStartE2EDuration="5.425757462s" podCreationTimestamp="2026-04-24 21:27:17 +0000 UTC" firstStartedPulling="2026-04-24 21:27:18.221550448 +0000 UTC m=+45.568456771" lastFinishedPulling="2026-04-24 21:27:21.558007235 +0000 UTC m=+48.904913560" observedRunningTime="2026-04-24 21:27:22.42553137 +0000 UTC m=+49.772437717" watchObservedRunningTime="2026-04-24 21:27:22.425757462 +0000 UTC m=+49.772663808" Apr 24 21:27:31.358307 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:31.358277 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xf75n" Apr 24 21:27:36.878146 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:36.878105 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" Apr 24 21:27:36.878146 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:36.878150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm" Apr 24 21:27:36.878674 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:36.878177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5" Apr 24 21:27:36.878674 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:36.878200 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" Apr 24 21:27:36.878674 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:36.878268 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:36.878674 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:36.878304 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:36.878674 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:36.878315 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cc4869f54-mx6ks: secret "image-registry-tls" not found Apr 24 21:27:36.878674 ip-10-0-133-48 kubenswrapper[2574]: E0424 
21:27:36.878346 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert podName:645772fe-eb62-4443-a6dd-6a10b3593053 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.878312313 +0000 UTC m=+96.225218638 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert") pod "ingress-canary-zq6rm" (UID: "645772fe-eb62-4443-a6dd-6a10b3593053") : secret "canary-serving-cert" not found Apr 24 21:27:36.878674 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:36.878275 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:36.878674 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:36.878273 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:36.878674 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:36.878396 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls podName:82766c41-fe9c-427b-8e96-271dc66e21d8 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.878369204 +0000 UTC m=+96.225275539 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls") pod "image-registry-cc4869f54-mx6ks" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8") : secret "image-registry-tls" not found Apr 24 21:27:36.878674 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:36.878414 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls podName:ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:08.878404432 +0000 UTC m=+96.225310756 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls") pod "dns-default-drrd5" (UID: "ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399") : secret "dns-default-metrics-tls" not found Apr 24 21:27:36.878674 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:36.878427 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert podName:1cd5a898-ba76-4c36-ab46-16db7f1b61bd nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.878419346 +0000 UTC m=+96.225325676 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jkddd" (UID: "1cd5a898-ba76-4c36-ab46-16db7f1b61bd") : secret "networking-console-plugin-cert" not found Apr 24 21:27:38.895259 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:38.895222 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc" Apr 24 21:27:38.895883 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:38.895437 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:27:38.895883 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:27:38.895525 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs podName:e2af22c1-baca-4054-87ff-daf77606438a nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:42.895504104 +0000 UTC m=+130.242410440 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs") pod "network-metrics-daemon-jtqkc" (UID: "e2af22c1-baca-4054-87ff-daf77606438a") : secret "metrics-daemon-secret" not found
Apr 24 21:27:43.386260 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:27:43.386152 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-wvpqz"
Apr 24 21:28:08.923574 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:28:08.923534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5"
Apr 24 21:28:08.923574 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:28:08.923584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:28:08.924118 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:28:08.923699 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:08.924118 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:28:08.923702 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:28:08.924118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:28:08.923752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd"
Apr 24 21:28:08.924118 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:28:08.923761 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls podName:ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:12.923744884 +0000 UTC m=+160.270651207 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls") pod "dns-default-drrd5" (UID: "ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:08.924118 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:28:08.923769 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cc4869f54-mx6ks: secret "image-registry-tls" not found
Apr 24 21:28:08.924118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:28:08.923795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm"
Apr 24 21:28:08.924118 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:28:08.923826 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls podName:82766c41-fe9c-427b-8e96-271dc66e21d8 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:12.9238128 +0000 UTC m=+160.270719123 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls") pod "image-registry-cc4869f54-mx6ks" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8") : secret "image-registry-tls" not found
Apr 24 21:28:08.924118 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:28:08.923882 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:08.924118 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:28:08.923888 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:28:08.924118 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:28:08.923940 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert podName:645772fe-eb62-4443-a6dd-6a10b3593053 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:12.923928193 +0000 UTC m=+160.270834520 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert") pod "ingress-canary-zq6rm" (UID: "645772fe-eb62-4443-a6dd-6a10b3593053") : secret "canary-serving-cert" not found
Apr 24 21:28:08.924118 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:28:08.923958 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert podName:1cd5a898-ba76-4c36-ab46-16db7f1b61bd nodeName:}" failed. No retries permitted until 2026-04-24 21:29:12.923949362 +0000 UTC m=+160.270855686 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jkddd" (UID: "1cd5a898-ba76-4c36-ab46-16db7f1b61bd") : secret "networking-console-plugin-cert" not found
Apr 24 21:28:42.969889 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:28:42.969851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc"
Apr 24 21:28:42.970520 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:28:42.970046 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:28:42.970520 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:28:42.970142 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs podName:e2af22c1-baca-4054-87ff-daf77606438a nodeName:}" failed. No retries permitted until 2026-04-24 21:30:44.970118942 +0000 UTC m=+252.317025294 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs") pod "network-metrics-daemon-jtqkc" (UID: "e2af22c1-baca-4054-87ff-daf77606438a") : secret "metrics-daemon-secret" not found
Apr 24 21:29:07.997447 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:07.997390 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" podUID="82766c41-fe9c-427b-8e96-271dc66e21d8"
Apr 24 21:29:08.005579 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:08.005538 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" podUID="1cd5a898-ba76-4c36-ab46-16db7f1b61bd"
Apr 24 21:29:08.027865 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:08.027837 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-drrd5" podUID="ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399"
Apr 24 21:29:08.034034 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:08.034005 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zq6rm" podUID="645772fe-eb62-4443-a6dd-6a10b3593053"
Apr 24 21:29:08.604484 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:08.604454 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:29:08.604651 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:08.604454 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-drrd5"
Apr 24 21:29:08.604651 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:08.604455 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zq6rm"
Apr 24 21:29:08.604719 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:08.604458 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd"
Apr 24 21:29:09.156043 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:09.156006 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jtqkc" podUID="e2af22c1-baca-4054-87ff-daf77606438a"
Apr 24 21:29:12.072628 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.072546 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-jpjmc"]
Apr 24 21:29:12.077350 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.077312 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.082475 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.082449 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:29:12.082703 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.082688 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 21:29:12.082753 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.082713 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-fsz7m\""
Apr 24 21:29:12.082801 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.082777 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 21:29:12.083433 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.083412 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 21:29:12.105981 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.105952 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-jpjmc"]
Apr 24 21:29:12.122483 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.122459 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 21:29:12.187302 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.187263 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a5443-bbde-4ab4-bb73-46a6160644d2-serving-cert\") pod \"console-operator-9d4b6777b-jpjmc\" (UID: \"219a5443-bbde-4ab4-bb73-46a6160644d2\") " pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.187523 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.187347 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/219a5443-bbde-4ab4-bb73-46a6160644d2-trusted-ca\") pod \"console-operator-9d4b6777b-jpjmc\" (UID: \"219a5443-bbde-4ab4-bb73-46a6160644d2\") " pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.187523 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.187376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqkwf\" (UniqueName: \"kubernetes.io/projected/219a5443-bbde-4ab4-bb73-46a6160644d2-kube-api-access-nqkwf\") pod \"console-operator-9d4b6777b-jpjmc\" (UID: \"219a5443-bbde-4ab4-bb73-46a6160644d2\") " pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.187523 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.187415 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a5443-bbde-4ab4-bb73-46a6160644d2-config\") pod \"console-operator-9d4b6777b-jpjmc\" (UID: \"219a5443-bbde-4ab4-bb73-46a6160644d2\") " pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.287932 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.287894 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/219a5443-bbde-4ab4-bb73-46a6160644d2-trusted-ca\") pod \"console-operator-9d4b6777b-jpjmc\" (UID: \"219a5443-bbde-4ab4-bb73-46a6160644d2\") " pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.287932 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.287930 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqkwf\" (UniqueName: \"kubernetes.io/projected/219a5443-bbde-4ab4-bb73-46a6160644d2-kube-api-access-nqkwf\") pod \"console-operator-9d4b6777b-jpjmc\" (UID: \"219a5443-bbde-4ab4-bb73-46a6160644d2\") " pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.288166 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.287956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a5443-bbde-4ab4-bb73-46a6160644d2-config\") pod \"console-operator-9d4b6777b-jpjmc\" (UID: \"219a5443-bbde-4ab4-bb73-46a6160644d2\") " pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.288166 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.288058 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a5443-bbde-4ab4-bb73-46a6160644d2-serving-cert\") pod \"console-operator-9d4b6777b-jpjmc\" (UID: \"219a5443-bbde-4ab4-bb73-46a6160644d2\") " pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.288780 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.288757 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a5443-bbde-4ab4-bb73-46a6160644d2-config\") pod \"console-operator-9d4b6777b-jpjmc\" (UID: \"219a5443-bbde-4ab4-bb73-46a6160644d2\") " pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.288880 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.288802 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/219a5443-bbde-4ab4-bb73-46a6160644d2-trusted-ca\") pod \"console-operator-9d4b6777b-jpjmc\" (UID: \"219a5443-bbde-4ab4-bb73-46a6160644d2\") " pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.290476 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.290460 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a5443-bbde-4ab4-bb73-46a6160644d2-serving-cert\") pod \"console-operator-9d4b6777b-jpjmc\" (UID: \"219a5443-bbde-4ab4-bb73-46a6160644d2\") " pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.296127 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.296106 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqkwf\" (UniqueName: \"kubernetes.io/projected/219a5443-bbde-4ab4-bb73-46a6160644d2-kube-api-access-nqkwf\") pod \"console-operator-9d4b6777b-jpjmc\" (UID: \"219a5443-bbde-4ab4-bb73-46a6160644d2\") " pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.386230 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.386146 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:12.504205 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.504165 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-jpjmc"]
Apr 24 21:29:12.612106 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.612065 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" event={"ID":"219a5443-bbde-4ab4-bb73-46a6160644d2","Type":"ContainerStarted","Data":"6457837691a39ca960cc7d0f7e7fca3e857bd42abb5bb9ecd6b04383da03473e"}
Apr 24 21:29:12.993452 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.993419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd"
Apr 24 21:29:12.993452 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.993468 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm"
Apr 24 21:29:12.993755 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.993509 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5"
Apr 24 21:29:12.993755 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:12.993536 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls\") pod \"image-registry-cc4869f54-mx6ks\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") " pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:29:12.993755 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:12.993571 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:29:12.993755 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:12.993648 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert podName:1cd5a898-ba76-4c36-ab46-16db7f1b61bd nodeName:}" failed. No retries permitted until 2026-04-24 21:31:14.99363175 +0000 UTC m=+282.340538078 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jkddd" (UID: "1cd5a898-ba76-4c36-ab46-16db7f1b61bd") : secret "networking-console-plugin-cert" not found
Apr 24 21:29:12.993755 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:12.993648 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:29:12.993755 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:12.993665 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cc4869f54-mx6ks: secret "image-registry-tls" not found
Apr 24 21:29:12.993755 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:12.993652 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:29:12.993755 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:12.993710 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls podName:82766c41-fe9c-427b-8e96-271dc66e21d8 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:14.993694282 +0000 UTC m=+282.340600606 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls") pod "image-registry-cc4869f54-mx6ks" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8") : secret "image-registry-tls" not found
Apr 24 21:29:12.993755 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:12.993652 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:29:12.993755 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:12.993740 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert podName:645772fe-eb62-4443-a6dd-6a10b3593053 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:14.993726888 +0000 UTC m=+282.340633217 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert") pod "ingress-canary-zq6rm" (UID: "645772fe-eb62-4443-a6dd-6a10b3593053") : secret "canary-serving-cert" not found
Apr 24 21:29:12.994092 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:12.993774 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls podName:ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:14.993761775 +0000 UTC m=+282.340668105 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls") pod "dns-default-drrd5" (UID: "ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399") : secret "dns-default-metrics-tls" not found
Apr 24 21:29:13.937612 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:13.937579 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7jkld"]
Apr 24 21:29:13.941783 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:13.941762 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7jkld"
Apr 24 21:29:13.943944 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:13.943919 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-v9gsh\""
Apr 24 21:29:13.947874 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:13.947848 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7jkld"]
Apr 24 21:29:14.000850 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:14.000811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m8gp\" (UniqueName: \"kubernetes.io/projected/e8dd8df3-5e85-4602-8b9c-38eae175e30e-kube-api-access-4m8gp\") pod \"network-check-source-8894fc9bd-7jkld\" (UID: \"e8dd8df3-5e85-4602-8b9c-38eae175e30e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7jkld"
Apr 24 21:29:14.101771 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:14.101732 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m8gp\" (UniqueName: \"kubernetes.io/projected/e8dd8df3-5e85-4602-8b9c-38eae175e30e-kube-api-access-4m8gp\") pod \"network-check-source-8894fc9bd-7jkld\" (UID: \"e8dd8df3-5e85-4602-8b9c-38eae175e30e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7jkld"
Apr 24 21:29:14.110563 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:14.110533 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m8gp\" (UniqueName: \"kubernetes.io/projected/e8dd8df3-5e85-4602-8b9c-38eae175e30e-kube-api-access-4m8gp\") pod \"network-check-source-8894fc9bd-7jkld\" (UID: \"e8dd8df3-5e85-4602-8b9c-38eae175e30e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7jkld"
Apr 24 21:29:14.252957 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:14.252868 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7jkld"
Apr 24 21:29:14.483754 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:14.483723 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7jkld"]
Apr 24 21:29:14.487428 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:29:14.487400 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8dd8df3_5e85_4602_8b9c_38eae175e30e.slice/crio-72136fd5dbe0c9c6e0362bcaa54a1ae6df811515dd2b5a9d555bd741784392da WatchSource:0}: Error finding container 72136fd5dbe0c9c6e0362bcaa54a1ae6df811515dd2b5a9d555bd741784392da: Status 404 returned error can't find the container with id 72136fd5dbe0c9c6e0362bcaa54a1ae6df811515dd2b5a9d555bd741784392da
Apr 24 21:29:14.618295 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:14.618262 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/0.log"
Apr 24 21:29:14.618488 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:14.618314 2574 generic.go:358] "Generic (PLEG): container finished" podID="219a5443-bbde-4ab4-bb73-46a6160644d2" containerID="dfbffbcca3f6a4ebbb32bb23fdab74238a1e47e28809b996dea935f1be4220d4" exitCode=255
Apr 24 21:29:14.618488 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:14.618428 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" event={"ID":"219a5443-bbde-4ab4-bb73-46a6160644d2","Type":"ContainerDied","Data":"dfbffbcca3f6a4ebbb32bb23fdab74238a1e47e28809b996dea935f1be4220d4"}
Apr 24 21:29:14.618660 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:14.618641 2574 scope.go:117] "RemoveContainer" containerID="dfbffbcca3f6a4ebbb32bb23fdab74238a1e47e28809b996dea935f1be4220d4"
Apr 24 21:29:14.619773 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:14.619746 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7jkld" event={"ID":"e8dd8df3-5e85-4602-8b9c-38eae175e30e","Type":"ContainerStarted","Data":"9126b552239aefd12683a2bbe3567d31c85828a225dc78328b46c683ed341732"}
Apr 24 21:29:14.619861 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:14.619785 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7jkld" event={"ID":"e8dd8df3-5e85-4602-8b9c-38eae175e30e","Type":"ContainerStarted","Data":"72136fd5dbe0c9c6e0362bcaa54a1ae6df811515dd2b5a9d555bd741784392da"}
Apr 24 21:29:14.652656 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:14.652605 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7jkld" podStartSLOduration=1.6525900770000002 podStartE2EDuration="1.652590077s" podCreationTimestamp="2026-04-24 21:29:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:29:14.651182858 +0000 UTC m=+161.998089203" watchObservedRunningTime="2026-04-24 21:29:14.652590077 +0000 UTC m=+161.999496423"
Apr 24 21:29:15.623635 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:15.623610 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/1.log"
Apr 24 21:29:15.624073 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:15.623946 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/0.log"
Apr 24 21:29:15.624073 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:15.623978 2574 generic.go:358] "Generic (PLEG): container finished" podID="219a5443-bbde-4ab4-bb73-46a6160644d2" containerID="c535ff716cb33b9d52edf39ca863a1a7b7fc76e287d7ce81d49b16ab09f176be" exitCode=255
Apr 24 21:29:15.624184 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:15.624061 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" event={"ID":"219a5443-bbde-4ab4-bb73-46a6160644d2","Type":"ContainerDied","Data":"c535ff716cb33b9d52edf39ca863a1a7b7fc76e287d7ce81d49b16ab09f176be"}
Apr 24 21:29:15.624184 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:15.624116 2574 scope.go:117] "RemoveContainer" containerID="dfbffbcca3f6a4ebbb32bb23fdab74238a1e47e28809b996dea935f1be4220d4"
Apr 24 21:29:15.624322 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:15.624305 2574 scope.go:117] "RemoveContainer" containerID="c535ff716cb33b9d52edf39ca863a1a7b7fc76e287d7ce81d49b16ab09f176be"
Apr 24 21:29:15.624526 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:15.624508 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-jpjmc_openshift-console-operator(219a5443-bbde-4ab4-bb73-46a6160644d2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" podUID="219a5443-bbde-4ab4-bb73-46a6160644d2"
Apr 24 21:29:16.627969 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:16.627941 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/1.log"
Apr 24 21:29:16.628360 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:16.628248 2574 scope.go:117] "RemoveContainer" containerID="c535ff716cb33b9d52edf39ca863a1a7b7fc76e287d7ce81d49b16ab09f176be"
Apr 24 21:29:16.628447 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:16.628427 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-jpjmc_openshift-console-operator(219a5443-bbde-4ab4-bb73-46a6160644d2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" podUID="219a5443-bbde-4ab4-bb73-46a6160644d2"
Apr 24 21:29:17.437573 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:17.437534 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mkr9h"]
Apr 24 21:29:17.440771 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:17.440749 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mkr9h"
Apr 24 21:29:17.443063 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:17.443045 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-wqfv8\""
Apr 24 21:29:17.450083 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:17.450061 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 24 21:29:17.451951 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:17.451933 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 24 21:29:17.463666 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:17.463639 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mkr9h"]
Apr 24 21:29:17.528678 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:17.528636 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdthn\" (UniqueName: \"kubernetes.io/projected/e6cceb68-1e95-4268-ba54-7b7980ce8560-kube-api-access-cdthn\") pod \"migrator-74bb7799d9-mkr9h\" (UID: \"e6cceb68-1e95-4268-ba54-7b7980ce8560\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mkr9h"
Apr 24 21:29:17.629364 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:17.629310 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdthn\" (UniqueName: \"kubernetes.io/projected/e6cceb68-1e95-4268-ba54-7b7980ce8560-kube-api-access-cdthn\") pod \"migrator-74bb7799d9-mkr9h\" (UID: \"e6cceb68-1e95-4268-ba54-7b7980ce8560\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mkr9h"
Apr 24 21:29:17.640679 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:17.640651 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdthn\" (UniqueName: \"kubernetes.io/projected/e6cceb68-1e95-4268-ba54-7b7980ce8560-kube-api-access-cdthn\") pod \"migrator-74bb7799d9-mkr9h\" (UID: \"e6cceb68-1e95-4268-ba54-7b7980ce8560\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mkr9h"
Apr 24 21:29:17.752378 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:17.752270 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mkr9h"
Apr 24 21:29:17.874853 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:17.874822 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-mkr9h"]
Apr 24 21:29:17.877937 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:29:17.877908 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6cceb68_1e95_4268_ba54_7b7980ce8560.slice/crio-2578ee7ec3fb659cefe8f197cd7502dd1f6ec2b5f2f2d2fcef8b2904caaf9881 WatchSource:0}: Error finding container 2578ee7ec3fb659cefe8f197cd7502dd1f6ec2b5f2f2d2fcef8b2904caaf9881: Status 404 returned error can't find the container with id 2578ee7ec3fb659cefe8f197cd7502dd1f6ec2b5f2f2d2fcef8b2904caaf9881
Apr 24 21:29:18.636168 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:18.636125 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mkr9h" event={"ID":"e6cceb68-1e95-4268-ba54-7b7980ce8560","Type":"ContainerStarted","Data":"2578ee7ec3fb659cefe8f197cd7502dd1f6ec2b5f2f2d2fcef8b2904caaf9881"}
Apr 24 21:29:18.799778 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:18.799751 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-g2zr8_eab474e7-7b20-4ca7-aedb-49915bb5ec3e/dns-node-resolver/0.log"
Apr 24 21:29:19.639941 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:19.639899 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mkr9h" event={"ID":"e6cceb68-1e95-4268-ba54-7b7980ce8560","Type":"ContainerStarted","Data":"d8d92c9f6c5871fa21e7d48f162c49abe97a4db9310bae7d876512b8cf7ecf83"}
Apr 24 21:29:19.639941 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:19.639943 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mkr9h" event={"ID":"e6cceb68-1e95-4268-ba54-7b7980ce8560","Type":"ContainerStarted","Data":"f2902be848003467d048d969a8a06cb7a4c30a9c8d8f989ce68f8093ba9c0c1d"}
Apr 24 21:29:19.657505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:19.657478 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pmknh_5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7/node-ca/0.log"
Apr 24 21:29:19.708147 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:19.708097 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-mkr9h" podStartSLOduration=1.420526884 podStartE2EDuration="2.70808508s" podCreationTimestamp="2026-04-24 21:29:17 +0000 UTC" firstStartedPulling="2026-04-24 21:29:17.879847385 +0000 UTC m=+165.226753723" lastFinishedPulling="2026-04-24 21:29:19.167405594 +0000 UTC m=+166.514311919" observedRunningTime="2026-04-24 21:29:19.70647767 +0000 UTC m=+167.053384016" watchObservedRunningTime="2026-04-24 21:29:19.70808508 +0000 UTC m=+167.054991425"
Apr 24 21:29:22.143426 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:22.143316 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc"
Apr 24 21:29:22.386614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:22.386575 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:22.386785 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:22.386634 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:29:22.386995 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:22.386983 2574 scope.go:117] "RemoveContainer" containerID="c535ff716cb33b9d52edf39ca863a1a7b7fc76e287d7ce81d49b16ab09f176be"
Apr 24 21:29:22.387159 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:22.387143 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-jpjmc_openshift-console-operator(219a5443-bbde-4ab4-bb73-46a6160644d2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" podUID="219a5443-bbde-4ab4-bb73-46a6160644d2"
Apr 24 21:29:22.649496 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:22.649462 2574 generic.go:358] "Generic (PLEG): container finished" podID="92fe8caa-8607-4158-bac2-d5f9e631a096" containerID="ce0334440dfdc2a3530c1c99838b807615e64862540e4da103b51c733934f1a6" exitCode=255
Apr 24 21:29:22.649647 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:22.649539 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" event={"ID":"92fe8caa-8607-4158-bac2-d5f9e631a096","Type":"ContainerDied","Data":"ce0334440dfdc2a3530c1c99838b807615e64862540e4da103b51c733934f1a6"}
Apr 24 21:29:22.655134 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:22.655116 2574 scope.go:117] "RemoveContainer"
containerID="ce0334440dfdc2a3530c1c99838b807615e64862540e4da103b51c733934f1a6" Apr 24 21:29:23.653433 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:23.653394 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b57b85fc-ppz8p" event={"ID":"92fe8caa-8607-4158-bac2-d5f9e631a096","Type":"ContainerStarted","Data":"dcf785232f948c2cf525fa86c20a17e840edfccc580491fde0d510360e1346de"} Apr 24 21:29:35.143891 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:35.143856 2574 scope.go:117] "RemoveContainer" containerID="c535ff716cb33b9d52edf39ca863a1a7b7fc76e287d7ce81d49b16ab09f176be" Apr 24 21:29:35.684775 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:35.684743 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:29:35.685148 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:35.685132 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/1.log" Apr 24 21:29:35.685207 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:35.685166 2574 generic.go:358] "Generic (PLEG): container finished" podID="219a5443-bbde-4ab4-bb73-46a6160644d2" containerID="8d29c6b9b7f3bc1c27e5abca8007265ef97ee568edf4431a47202d3cec17c082" exitCode=255 Apr 24 21:29:35.685254 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:35.685209 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" event={"ID":"219a5443-bbde-4ab4-bb73-46a6160644d2","Type":"ContainerDied","Data":"8d29c6b9b7f3bc1c27e5abca8007265ef97ee568edf4431a47202d3cec17c082"} Apr 24 21:29:35.685254 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:35.685241 2574 scope.go:117] "RemoveContainer" 
containerID="c535ff716cb33b9d52edf39ca863a1a7b7fc76e287d7ce81d49b16ab09f176be" Apr 24 21:29:35.685594 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:35.685578 2574 scope.go:117] "RemoveContainer" containerID="8d29c6b9b7f3bc1c27e5abca8007265ef97ee568edf4431a47202d3cec17c082" Apr 24 21:29:35.685794 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:35.685773 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-jpjmc_openshift-console-operator(219a5443-bbde-4ab4-bb73-46a6160644d2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" podUID="219a5443-bbde-4ab4-bb73-46a6160644d2" Apr 24 21:29:36.688761 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:36.688734 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:29:40.611394 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.611360 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-mvrpd"] Apr 24 21:29:40.614479 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.614462 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:40.617349 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:40.617308 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-133-48.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-133-48.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 24 21:29:40.617998 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:40.617970 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"insights-runtime-extractor-tls\" is forbidden: User \"system:node:ip-10-0-133-48.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-133-48.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" type="*v1.Secret" Apr 24 21:29:40.618285 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:40.618267 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:ip-10-0-133-48.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-insights\": no relationship found between node 'ip-10-0-133-48.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" type="*v1.ConfigMap" Apr 24 21:29:40.618833 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.618820 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-b67gj\"" Apr 24 21:29:40.618877 ip-10-0-133-48 kubenswrapper[2574]: 
I0424 21:29:40.618822 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:29:40.637090 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.637064 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mvrpd"] Apr 24 21:29:40.705621 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.705589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzdlt\" (UniqueName: \"kubernetes.io/projected/c3ca6061-7a72-47f7-9755-4619b0e3b74e-kube-api-access-bzdlt\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:40.705621 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.705625 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c3ca6061-7a72-47f7-9755-4619b0e3b74e-crio-socket\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:40.705844 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.705715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c3ca6061-7a72-47f7-9755-4619b0e3b74e-data-volume\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:40.705844 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.705788 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c3ca6061-7a72-47f7-9755-4619b0e3b74e-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:40.705844 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.705812 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c3ca6061-7a72-47f7-9755-4619b0e3b74e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:40.807014 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.806977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c3ca6061-7a72-47f7-9755-4619b0e3b74e-data-volume\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:40.807170 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.807030 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c3ca6061-7a72-47f7-9755-4619b0e3b74e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:40.807170 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.807056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c3ca6061-7a72-47f7-9755-4619b0e3b74e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:40.807170 ip-10-0-133-48 kubenswrapper[2574]: I0424 
21:29:40.807110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzdlt\" (UniqueName: \"kubernetes.io/projected/c3ca6061-7a72-47f7-9755-4619b0e3b74e-kube-api-access-bzdlt\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:40.807170 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.807150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c3ca6061-7a72-47f7-9755-4619b0e3b74e-crio-socket\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:40.807347 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.807262 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c3ca6061-7a72-47f7-9755-4619b0e3b74e-crio-socket\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:40.807393 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:40.807326 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c3ca6061-7a72-47f7-9755-4619b0e3b74e-data-volume\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:41.456163 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:41.456126 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:29:41.459933 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:41.459900 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" 
(UniqueName: \"kubernetes.io/configmap/c3ca6061-7a72-47f7-9755-4619b0e3b74e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:41.807441 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:41.807352 2574 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: failed to sync secret cache: timed out waiting for the condition Apr 24 21:29:41.807441 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:41.807435 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3ca6061-7a72-47f7-9755-4619b0e3b74e-insights-runtime-extractor-tls podName:c3ca6061-7a72-47f7-9755-4619b0e3b74e nodeName:}" failed. No retries permitted until 2026-04-24 21:29:42.307416539 +0000 UTC m=+189.654322868 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c3ca6061-7a72-47f7-9755-4619b0e3b74e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-mvrpd" (UID: "c3ca6061-7a72-47f7-9755-4619b0e3b74e") : failed to sync secret cache: timed out waiting for the condition Apr 24 21:29:41.823807 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:41.823768 2574 projected.go:289] Couldn't get configMap openshift-insights/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Apr 24 21:29:41.823985 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:41.823819 2574 projected.go:194] Error preparing data for projected volume kube-api-access-bzdlt for pod openshift-insights/insights-runtime-extractor-mvrpd: failed to sync configmap cache: timed out waiting for the condition Apr 24 21:29:41.823985 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:41.823892 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c3ca6061-7a72-47f7-9755-4619b0e3b74e-kube-api-access-bzdlt 
podName:c3ca6061-7a72-47f7-9755-4619b0e3b74e nodeName:}" failed. No retries permitted until 2026-04-24 21:29:42.323870529 +0000 UTC m=+189.670776853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bzdlt" (UniqueName: "kubernetes.io/projected/c3ca6061-7a72-47f7-9755-4619b0e3b74e-kube-api-access-bzdlt") pod "insights-runtime-extractor-mvrpd" (UID: "c3ca6061-7a72-47f7-9755-4619b0e3b74e") : failed to sync configmap cache: timed out waiting for the condition Apr 24 21:29:41.878300 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:41.878261 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:29:42.044664 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:42.044630 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:29:42.319499 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:42.319454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c3ca6061-7a72-47f7-9755-4619b0e3b74e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:42.321711 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:42.321681 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c3ca6061-7a72-47f7-9755-4619b0e3b74e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:42.386777 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:42.386726 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" Apr 24 21:29:42.386777 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:42.386785 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" Apr 24 21:29:42.387148 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:42.387136 2574 scope.go:117] "RemoveContainer" containerID="8d29c6b9b7f3bc1c27e5abca8007265ef97ee568edf4431a47202d3cec17c082" Apr 24 21:29:42.387317 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:42.387301 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-jpjmc_openshift-console-operator(219a5443-bbde-4ab4-bb73-46a6160644d2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" podUID="219a5443-bbde-4ab4-bb73-46a6160644d2" Apr 24 21:29:42.420182 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:42.420150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzdlt\" (UniqueName: \"kubernetes.io/projected/c3ca6061-7a72-47f7-9755-4619b0e3b74e-kube-api-access-bzdlt\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:42.422681 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:42.422653 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzdlt\" (UniqueName: \"kubernetes.io/projected/c3ca6061-7a72-47f7-9755-4619b0e3b74e-kube-api-access-bzdlt\") pod \"insights-runtime-extractor-mvrpd\" (UID: \"c3ca6061-7a72-47f7-9755-4619b0e3b74e\") " pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:42.422816 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:42.422803 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mvrpd" Apr 24 21:29:42.544622 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:42.544594 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mvrpd"] Apr 24 21:29:42.547921 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:29:42.547884 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3ca6061_7a72_47f7_9755_4619b0e3b74e.slice/crio-286fbf545f906d535c8a070891c01b5808b2beec6eb457b50ad721f4faecb0ba WatchSource:0}: Error finding container 286fbf545f906d535c8a070891c01b5808b2beec6eb457b50ad721f4faecb0ba: Status 404 returned error can't find the container with id 286fbf545f906d535c8a070891c01b5808b2beec6eb457b50ad721f4faecb0ba Apr 24 21:29:42.704138 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:42.704103 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mvrpd" event={"ID":"c3ca6061-7a72-47f7-9755-4619b0e3b74e","Type":"ContainerStarted","Data":"32113b7d46bcccace7ec09799720d58cbdb1eeb897eadf3a750681281e65387d"} Apr 24 21:29:42.704138 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:42.704140 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mvrpd" event={"ID":"c3ca6061-7a72-47f7-9755-4619b0e3b74e","Type":"ContainerStarted","Data":"286fbf545f906d535c8a070891c01b5808b2beec6eb457b50ad721f4faecb0ba"} Apr 24 21:29:43.345778 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.345745 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jx79l"] Apr 24 21:29:43.348820 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.348803 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" Apr 24 21:29:43.353256 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.353233 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 21:29:43.353417 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.353399 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:29:43.353456 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.353399 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 21:29:43.353534 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.353516 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-lcpps\"" Apr 24 21:29:43.354591 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.354567 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:29:43.354591 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.354581 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:29:43.363517 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.363495 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jx79l"] Apr 24 21:29:43.528945 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.528899 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d53ae6be-1641-41bb-8724-bcfe224ed319-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jx79l\" (UID: 
\"d53ae6be-1641-41bb-8724-bcfe224ed319\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" Apr 24 21:29:43.529143 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.528967 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d53ae6be-1641-41bb-8724-bcfe224ed319-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jx79l\" (UID: \"d53ae6be-1641-41bb-8724-bcfe224ed319\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" Apr 24 21:29:43.529143 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.529011 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmvp\" (UniqueName: \"kubernetes.io/projected/d53ae6be-1641-41bb-8724-bcfe224ed319-kube-api-access-ccmvp\") pod \"prometheus-operator-5676c8c784-jx79l\" (UID: \"d53ae6be-1641-41bb-8724-bcfe224ed319\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" Apr 24 21:29:43.529143 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.529088 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d53ae6be-1641-41bb-8724-bcfe224ed319-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jx79l\" (UID: \"d53ae6be-1641-41bb-8724-bcfe224ed319\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" Apr 24 21:29:43.630049 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.630015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d53ae6be-1641-41bb-8724-bcfe224ed319-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jx79l\" (UID: \"d53ae6be-1641-41bb-8724-bcfe224ed319\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" Apr 24 21:29:43.630242 ip-10-0-133-48 
kubenswrapper[2574]: I0424 21:29:43.630109 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d53ae6be-1641-41bb-8724-bcfe224ed319-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jx79l\" (UID: \"d53ae6be-1641-41bb-8724-bcfe224ed319\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" Apr 24 21:29:43.630242 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.630155 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d53ae6be-1641-41bb-8724-bcfe224ed319-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jx79l\" (UID: \"d53ae6be-1641-41bb-8724-bcfe224ed319\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" Apr 24 21:29:43.630242 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.630194 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccmvp\" (UniqueName: \"kubernetes.io/projected/d53ae6be-1641-41bb-8724-bcfe224ed319-kube-api-access-ccmvp\") pod \"prometheus-operator-5676c8c784-jx79l\" (UID: \"d53ae6be-1641-41bb-8724-bcfe224ed319\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" Apr 24 21:29:43.630823 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.630799 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d53ae6be-1641-41bb-8724-bcfe224ed319-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jx79l\" (UID: \"d53ae6be-1641-41bb-8724-bcfe224ed319\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" Apr 24 21:29:43.632977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.632940 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/d53ae6be-1641-41bb-8724-bcfe224ed319-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jx79l\" (UID: \"d53ae6be-1641-41bb-8724-bcfe224ed319\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l"
Apr 24 21:29:43.632977 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.632958 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d53ae6be-1641-41bb-8724-bcfe224ed319-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jx79l\" (UID: \"d53ae6be-1641-41bb-8724-bcfe224ed319\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l"
Apr 24 21:29:43.641414 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.641388 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmvp\" (UniqueName: \"kubernetes.io/projected/d53ae6be-1641-41bb-8724-bcfe224ed319-kube-api-access-ccmvp\") pod \"prometheus-operator-5676c8c784-jx79l\" (UID: \"d53ae6be-1641-41bb-8724-bcfe224ed319\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l"
Apr 24 21:29:43.658168 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.658142 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l"
Apr 24 21:29:43.711997 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.711113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mvrpd" event={"ID":"c3ca6061-7a72-47f7-9755-4619b0e3b74e","Type":"ContainerStarted","Data":"f315dae0fa2418d343b5a850e1f4415b69c1dbce34be00ee2e3d158247b065df"}
Apr 24 21:29:43.793535 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:43.793443 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jx79l"]
Apr 24 21:29:43.798450 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:29:43.798410 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd53ae6be_1641_41bb_8724_bcfe224ed319.slice/crio-86ab1f169f792bd095fbc40017548a54e3346e38bffd7d6c36002c5cdcb48100 WatchSource:0}: Error finding container 86ab1f169f792bd095fbc40017548a54e3346e38bffd7d6c36002c5cdcb48100: Status 404 returned error can't find the container with id 86ab1f169f792bd095fbc40017548a54e3346e38bffd7d6c36002c5cdcb48100
Apr 24 21:29:44.715115 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:44.715077 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" event={"ID":"d53ae6be-1641-41bb-8724-bcfe224ed319","Type":"ContainerStarted","Data":"86ab1f169f792bd095fbc40017548a54e3346e38bffd7d6c36002c5cdcb48100"}
Apr 24 21:29:44.717112 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:44.717062 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mvrpd" event={"ID":"c3ca6061-7a72-47f7-9755-4619b0e3b74e","Type":"ContainerStarted","Data":"323b0ad9d0df9eee93baa103d3b390c0a0f89d19ee1953b43dd109ec99437a28"}
Apr 24 21:29:44.739042 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:44.738985 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-mvrpd" podStartSLOduration=2.950143389 podStartE2EDuration="4.738967932s" podCreationTimestamp="2026-04-24 21:29:40 +0000 UTC" firstStartedPulling="2026-04-24 21:29:42.603456813 +0000 UTC m=+189.950363137" lastFinishedPulling="2026-04-24 21:29:44.392281346 +0000 UTC m=+191.739187680" observedRunningTime="2026-04-24 21:29:44.73796516 +0000 UTC m=+192.084871506" watchObservedRunningTime="2026-04-24 21:29:44.738967932 +0000 UTC m=+192.085874278"
Apr 24 21:29:45.720533 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:45.720489 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" event={"ID":"d53ae6be-1641-41bb-8724-bcfe224ed319","Type":"ContainerStarted","Data":"69972fa6e9ac3cd4f7bd4738cf18370dabe76fe946b38e6b005a78607c6c1f27"}
Apr 24 21:29:45.720944 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:45.720538 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" event={"ID":"d53ae6be-1641-41bb-8724-bcfe224ed319","Type":"ContainerStarted","Data":"88fec12b52b1a587aea4a23a692d5d7e1832b04ba4def63c6c5dca15efab83bb"}
Apr 24 21:29:45.749265 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:45.749209 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-jx79l" podStartSLOduration=1.330644742 podStartE2EDuration="2.749195876s" podCreationTimestamp="2026-04-24 21:29:43 +0000 UTC" firstStartedPulling="2026-04-24 21:29:43.800992226 +0000 UTC m=+191.147898550" lastFinishedPulling="2026-04-24 21:29:45.219543351 +0000 UTC m=+192.566449684" observedRunningTime="2026-04-24 21:29:45.749123495 +0000 UTC m=+193.096029842" watchObservedRunningTime="2026-04-24 21:29:45.749195876 +0000 UTC m=+193.096102222"
Apr 24 21:29:47.777924 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.777890 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"]
Apr 24 21:29:47.781047 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.781030 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:47.784107 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.784082 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 24 21:29:47.784238 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.784160 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 24 21:29:47.784917 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.784895 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-wdkvb\""
Apr 24 21:29:47.785045 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.784984 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 24 21:29:47.799749 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.799724 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"]
Apr 24 21:29:47.801609 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.801590 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7l4np"]
Apr 24 21:29:47.804924 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.804906 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:47.807844 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.807822 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 21:29:47.809609 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.809589 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 21:29:47.809786 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.809768 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-stmx8\""
Apr 24 21:29:47.809849 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.809833 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 21:29:47.967171 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967129 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-tls\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:47.967171 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967173 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/079d5544-a12a-4f44-b625-ddbc27905004-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:47.967446 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967198 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-accelerators-collector-config\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:47.967446 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5056461-495e-4986-b3fb-3519148ed518-metrics-client-ca\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:47.967446 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967235 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5056461-495e-4986-b3fb-3519148ed518-sys\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:47.967446 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967282 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-wtmp\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:47.967446 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967368 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-textfile\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:47.967446 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967408 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/079d5544-a12a-4f44-b625-ddbc27905004-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:47.967706 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967458 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fp8j\" (UniqueName: \"kubernetes.io/projected/079d5544-a12a-4f44-b625-ddbc27905004-kube-api-access-5fp8j\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:47.967706 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967479 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e5056461-495e-4986-b3fb-3519148ed518-root\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:47.967706 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967552 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/079d5544-a12a-4f44-b625-ddbc27905004-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:47.967706 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967573 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/079d5544-a12a-4f44-b625-ddbc27905004-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:47.967706 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967617 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qglwz\" (UniqueName: \"kubernetes.io/projected/e5056461-495e-4986-b3fb-3519148ed518-kube-api-access-qglwz\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:47.967706 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967692 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:47.967915 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:47.967741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/079d5544-a12a-4f44-b625-ddbc27905004-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.069164 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069083 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fp8j\" (UniqueName: \"kubernetes.io/projected/079d5544-a12a-4f44-b625-ddbc27905004-kube-api-access-5fp8j\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.069164 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e5056461-495e-4986-b3fb-3519148ed518-root\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.069384 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/079d5544-a12a-4f44-b625-ddbc27905004-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.069384 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069240 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/079d5544-a12a-4f44-b625-ddbc27905004-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.069384 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069256 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e5056461-495e-4986-b3fb-3519148ed518-root\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.069384 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qglwz\" (UniqueName: \"kubernetes.io/projected/e5056461-495e-4986-b3fb-3519148ed518-kube-api-access-qglwz\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.069384 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.069384 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:48.069372 2574 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 24 21:29:48.069665 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069403 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/079d5544-a12a-4f44-b625-ddbc27905004-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.069665 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:48.069431 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/079d5544-a12a-4f44-b625-ddbc27905004-kube-state-metrics-tls podName:079d5544-a12a-4f44-b625-ddbc27905004 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:48.569411269 +0000 UTC m=+195.916317604 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/079d5544-a12a-4f44-b625-ddbc27905004-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-cf4ps" (UID: "079d5544-a12a-4f44-b625-ddbc27905004") : secret "kube-state-metrics-tls" not found
Apr 24 21:29:48.069665 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069462 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-tls\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.069665 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/079d5544-a12a-4f44-b625-ddbc27905004-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.069665 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-accelerators-collector-config\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.069665 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069551 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5056461-495e-4986-b3fb-3519148ed518-metrics-client-ca\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.069665 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069577 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5056461-495e-4986-b3fb-3519148ed518-sys\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.069665 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-wtmp\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.069665 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069656 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-textfile\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.070106 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/079d5544-a12a-4f44-b625-ddbc27905004-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.070106 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069750 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/079d5544-a12a-4f44-b625-ddbc27905004-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.070106 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069891 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-wtmp\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.070106 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.069938 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e5056461-495e-4986-b3fb-3519148ed518-sys\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.070302 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.070245 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5056461-495e-4986-b3fb-3519148ed518-metrics-client-ca\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.070497 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.070468 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/079d5544-a12a-4f44-b625-ddbc27905004-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.070579 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.070541 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-textfile\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.070648 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.070617 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-accelerators-collector-config\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.070965 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.070943 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/079d5544-a12a-4f44-b625-ddbc27905004-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.071983 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.071956 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-tls\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.072248 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.072228 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e5056461-495e-4986-b3fb-3519148ed518-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.072291 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.072233 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/079d5544-a12a-4f44-b625-ddbc27905004-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.085247 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.085218 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qglwz\" (UniqueName: \"kubernetes.io/projected/e5056461-495e-4986-b3fb-3519148ed518-kube-api-access-qglwz\") pod \"node-exporter-7l4np\" (UID: \"e5056461-495e-4986-b3fb-3519148ed518\") " pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.087209 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.087189 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fp8j\" (UniqueName: \"kubernetes.io/projected/079d5544-a12a-4f44-b625-ddbc27905004-kube-api-access-5fp8j\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.114007 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.113980 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7l4np"
Apr 24 21:29:48.121450 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:29:48.121424 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5056461_495e_4986_b3fb_3519148ed518.slice/crio-e9fb6886d5f12e7d04c61a7225584c83feeeb36b4a796bf40ac42545801ee6dd WatchSource:0}: Error finding container e9fb6886d5f12e7d04c61a7225584c83feeeb36b4a796bf40ac42545801ee6dd: Status 404 returned error can't find the container with id e9fb6886d5f12e7d04c61a7225584c83feeeb36b4a796bf40ac42545801ee6dd
Apr 24 21:29:48.575451 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.575412 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/079d5544-a12a-4f44-b625-ddbc27905004-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.578209 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.578183 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/079d5544-a12a-4f44-b625-ddbc27905004-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cf4ps\" (UID: \"079d5544-a12a-4f44-b625-ddbc27905004\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.690478 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.690438 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"
Apr 24 21:29:48.729528 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.729482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7l4np" event={"ID":"e5056461-495e-4986-b3fb-3519148ed518","Type":"ContainerStarted","Data":"e9fb6886d5f12e7d04c61a7225584c83feeeb36b4a796bf40ac42545801ee6dd"}
Apr 24 21:29:48.910372 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:48.910322 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cf4ps"]
Apr 24 21:29:48.913091 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:29:48.913068 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod079d5544_a12a_4f44_b625_ddbc27905004.slice/crio-4f184d33ade238c2bd9a6615337d205698200dec693d71351937905d95da197d WatchSource:0}: Error finding container 4f184d33ade238c2bd9a6615337d205698200dec693d71351937905d95da197d: Status 404 returned error can't find the container with id 4f184d33ade238c2bd9a6615337d205698200dec693d71351937905d95da197d
Apr 24 21:29:49.733244 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.733205 2574 generic.go:358] "Generic (PLEG): container finished" podID="e5056461-495e-4986-b3fb-3519148ed518" containerID="cf42042b7fcf184f2fa3a7ee91d5562868c9cb08220060193a4fdcc6ca036849" exitCode=0
Apr 24 21:29:49.733426 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.733286 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7l4np" event={"ID":"e5056461-495e-4986-b3fb-3519148ed518","Type":"ContainerDied","Data":"cf42042b7fcf184f2fa3a7ee91d5562868c9cb08220060193a4fdcc6ca036849"}
Apr 24 21:29:49.734431 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.734411 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps" event={"ID":"079d5544-a12a-4f44-b625-ddbc27905004","Type":"ContainerStarted","Data":"4f184d33ade238c2bd9a6615337d205698200dec693d71351937905d95da197d"}
Apr 24 21:29:49.821437 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.821400 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"]
Apr 24 21:29:49.825565 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.825544 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.828505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.828230 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-bobhft936f674\""
Apr 24 21:29:49.828505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.828271 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 24 21:29:49.828505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.828285 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 24 21:29:49.828505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.828323 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 24 21:29:49.828505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.828410 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-zgz9s\""
Apr 24 21:29:49.828505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.828426 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 24 21:29:49.828505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.828433 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 24 21:29:49.836104 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.836084 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"]
Apr 24 21:29:49.886495 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.886458 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.886694 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.886504 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.886694 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.886576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdf9z\" (UniqueName: \"kubernetes.io/projected/914ee558-60df-40ee-a269-3bed78eff9a0-kube-api-access-vdf9z\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.886694 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.886603 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.886694 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.886629 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/914ee558-60df-40ee-a269-3bed78eff9a0-metrics-client-ca\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.886694 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.886660 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-tls\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.886904 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.886703 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-grpc-tls\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.886904 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.886739 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.987909 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.987816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.987909 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.987879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.988499 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.987958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdf9z\" (UniqueName: \"kubernetes.io/projected/914ee558-60df-40ee-a269-3bed78eff9a0-kube-api-access-vdf9z\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.988499 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.987994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:49.988499
ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.988016 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/914ee558-60df-40ee-a269-3bed78eff9a0-metrics-client-ca\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:49.988499 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.988050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-tls\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:49.988499 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.988108 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-grpc-tls\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:49.988499 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.988144 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:49.989107 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.989040 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/914ee558-60df-40ee-a269-3bed78eff9a0-metrics-client-ca\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:49.991228 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.991189 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:49.991228 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.991221 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-tls\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:49.991427 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.991395 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:49.991489 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.991426 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: 
\"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:49.991691 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.991664 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:49.992382 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.992360 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/914ee558-60df-40ee-a269-3bed78eff9a0-secret-grpc-tls\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:49.995891 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:49.995868 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdf9z\" (UniqueName: \"kubernetes.io/projected/914ee558-60df-40ee-a269-3bed78eff9a0-kube-api-access-vdf9z\") pod \"thanos-querier-6b5dc768d8-mcjn6\" (UID: \"914ee558-60df-40ee-a269-3bed78eff9a0\") " pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:50.136643 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:50.136610 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:50.305625 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:50.305567 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"] Apr 24 21:29:50.310262 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:29:50.310230 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod914ee558_60df_40ee_a269_3bed78eff9a0.slice/crio-e7dbf97c48ead5e881d0c75600e47c3d9e11dabdecf4d224335a6af0585051e6 WatchSource:0}: Error finding container e7dbf97c48ead5e881d0c75600e47c3d9e11dabdecf4d224335a6af0585051e6: Status 404 returned error can't find the container with id e7dbf97c48ead5e881d0c75600e47c3d9e11dabdecf4d224335a6af0585051e6 Apr 24 21:29:50.740596 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:50.740551 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps" event={"ID":"079d5544-a12a-4f44-b625-ddbc27905004","Type":"ContainerStarted","Data":"aadd7762568698022b6fda3854f34a0d9338dabaf19e3ab7e904e86cb4da1910"} Apr 24 21:29:50.740749 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:50.740596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps" event={"ID":"079d5544-a12a-4f44-b625-ddbc27905004","Type":"ContainerStarted","Data":"751c2be42cfddc8582e65e4a1a51c0ac30f4b2962f499a65dd1fceb41c7d973f"} Apr 24 21:29:50.740749 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:50.740622 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps" event={"ID":"079d5544-a12a-4f44-b625-ddbc27905004","Type":"ContainerStarted","Data":"330a13b4359f2f46b4bbb81042288d4b5626290b0bf723d048420fb4ba40791f"} Apr 24 21:29:50.742647 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:50.742596 2574 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" event={"ID":"914ee558-60df-40ee-a269-3bed78eff9a0","Type":"ContainerStarted","Data":"e7dbf97c48ead5e881d0c75600e47c3d9e11dabdecf4d224335a6af0585051e6"} Apr 24 21:29:50.744742 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:50.744715 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7l4np" event={"ID":"e5056461-495e-4986-b3fb-3519148ed518","Type":"ContainerStarted","Data":"789ce9bde28ccd8189b6819b0f1e3eb2ad424d77e7f645c41067ee1ff872ef02"} Apr 24 21:29:50.744879 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:50.744748 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7l4np" event={"ID":"e5056461-495e-4986-b3fb-3519148ed518","Type":"ContainerStarted","Data":"7123aba53f71fdacabe32a068aab52d1dc5d3c1f915678ec47be8e300c22b46e"} Apr 24 21:29:50.767073 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:50.767018 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-cf4ps" podStartSLOduration=2.624595033 podStartE2EDuration="3.767004156s" podCreationTimestamp="2026-04-24 21:29:47 +0000 UTC" firstStartedPulling="2026-04-24 21:29:48.915005059 +0000 UTC m=+196.261911398" lastFinishedPulling="2026-04-24 21:29:50.057414181 +0000 UTC m=+197.404320521" observedRunningTime="2026-04-24 21:29:50.765674183 +0000 UTC m=+198.112580529" watchObservedRunningTime="2026-04-24 21:29:50.767004156 +0000 UTC m=+198.113910501" Apr 24 21:29:50.787985 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:50.787940 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7l4np" podStartSLOduration=3.078103338 podStartE2EDuration="3.787925654s" podCreationTimestamp="2026-04-24 21:29:47 +0000 UTC" firstStartedPulling="2026-04-24 21:29:48.123129317 +0000 UTC m=+195.470035642" 
lastFinishedPulling="2026-04-24 21:29:48.832951633 +0000 UTC m=+196.179857958" observedRunningTime="2026-04-24 21:29:50.786853001 +0000 UTC m=+198.133759364" watchObservedRunningTime="2026-04-24 21:29:50.787925654 +0000 UTC m=+198.134832001" Apr 24 21:29:52.476443 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.476396 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl"] Apr 24 21:29:52.480045 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.480024 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl" Apr 24 21:29:52.484462 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.484436 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 21:29:52.484604 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.484499 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-wwrzf\"" Apr 24 21:29:52.494928 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.494883 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl"] Apr 24 21:29:52.511769 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.511686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b155916-68a3-40b1-8d71-903f176840f4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-dc4hl\" (UID: \"8b155916-68a3-40b1-8d71-903f176840f4\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl" Apr 24 21:29:52.612290 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.612264 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/8b155916-68a3-40b1-8d71-903f176840f4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-dc4hl\" (UID: \"8b155916-68a3-40b1-8d71-903f176840f4\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl" Apr 24 21:29:52.612453 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:52.612433 2574 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 24 21:29:52.612503 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:52.612496 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b155916-68a3-40b1-8d71-903f176840f4-monitoring-plugin-cert podName:8b155916-68a3-40b1-8d71-903f176840f4 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:53.112481046 +0000 UTC m=+200.459387370 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/8b155916-68a3-40b1-8d71-903f176840f4-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-dc4hl" (UID: "8b155916-68a3-40b1-8d71-903f176840f4") : secret "monitoring-plugin-cert" not found Apr 24 21:29:52.758829 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.758793 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" event={"ID":"914ee558-60df-40ee-a269-3bed78eff9a0","Type":"ContainerStarted","Data":"a6a0986aaaca971831286103b61a2efc49cd4a6615b8b7a04ed916d5da94008a"} Apr 24 21:29:52.758829 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.758831 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" event={"ID":"914ee558-60df-40ee-a269-3bed78eff9a0","Type":"ContainerStarted","Data":"7d055e75ab394e44b69dcb7cfe1dbe37c15aad885db5e69885fb3654f1c01385"} Apr 24 21:29:52.759028 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.758846 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" event={"ID":"914ee558-60df-40ee-a269-3bed78eff9a0","Type":"ContainerStarted","Data":"34ad3d78ca4de6b233cc2884134eca71f559bda18de6a44463318ce6d603e79b"} Apr 24 21:29:52.759028 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.758858 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" event={"ID":"914ee558-60df-40ee-a269-3bed78eff9a0","Type":"ContainerStarted","Data":"1101891e47fbc315ef91e6fe096d8de0af044b1f90316b15a6921832c918ec91"} Apr 24 21:29:52.759028 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.758867 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" event={"ID":"914ee558-60df-40ee-a269-3bed78eff9a0","Type":"ContainerStarted","Data":"ce56b03ccd89888e4d9bf0568dfa737eaf201dacf2973a715fbe13f695bd8418"} Apr 24 21:29:52.759028 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.758876 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" event={"ID":"914ee558-60df-40ee-a269-3bed78eff9a0","Type":"ContainerStarted","Data":"d87a12d4271f5f348175a9a437103a6042cbe8fe372ef23b0efff1928d07ada6"} Apr 24 21:29:52.759028 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.759007 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" Apr 24 21:29:52.785100 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:52.785038 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6" podStartSLOduration=1.526114573 podStartE2EDuration="3.78501713s" podCreationTimestamp="2026-04-24 21:29:49 +0000 UTC" firstStartedPulling="2026-04-24 21:29:50.312276483 +0000 UTC m=+197.659182807" lastFinishedPulling="2026-04-24 21:29:52.57117904 +0000 UTC m=+199.918085364" observedRunningTime="2026-04-24 
21:29:52.783464027 +0000 UTC m=+200.130370397" watchObservedRunningTime="2026-04-24 21:29:52.78501713 +0000 UTC m=+200.131923481" Apr 24 21:29:53.116543 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:53.116438 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b155916-68a3-40b1-8d71-903f176840f4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-dc4hl\" (UID: \"8b155916-68a3-40b1-8d71-903f176840f4\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl" Apr 24 21:29:53.118973 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:53.118953 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b155916-68a3-40b1-8d71-903f176840f4-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-dc4hl\" (UID: \"8b155916-68a3-40b1-8d71-903f176840f4\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl" Apr 24 21:29:53.144630 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:53.144604 2574 scope.go:117] "RemoveContainer" containerID="8d29c6b9b7f3bc1c27e5abca8007265ef97ee568edf4431a47202d3cec17c082" Apr 24 21:29:53.144808 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:29:53.144792 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-jpjmc_openshift-console-operator(219a5443-bbde-4ab4-bb73-46a6160644d2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" podUID="219a5443-bbde-4ab4-bb73-46a6160644d2" Apr 24 21:29:53.390576 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:53.390498 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl" Apr 24 21:29:53.516607 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:53.516574 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl"] Apr 24 21:29:53.519108 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:29:53.519083 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b155916_68a3_40b1_8d71_903f176840f4.slice/crio-61d3f4d2450e98b7d52023426a2ae64aa0f121abcd107327776aab4c69ab650f WatchSource:0}: Error finding container 61d3f4d2450e98b7d52023426a2ae64aa0f121abcd107327776aab4c69ab650f: Status 404 returned error can't find the container with id 61d3f4d2450e98b7d52023426a2ae64aa0f121abcd107327776aab4c69ab650f Apr 24 21:29:53.763263 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:53.763222 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl" event={"ID":"8b155916-68a3-40b1-8d71-903f176840f4","Type":"ContainerStarted","Data":"61d3f4d2450e98b7d52023426a2ae64aa0f121abcd107327776aab4c69ab650f"} Apr 24 21:29:54.038794 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.038717 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:29:54.044043 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.044016 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:54.046695 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.046507 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:29:54.048738 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.048713 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:29:54.048849 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.048713 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:29:54.049081 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.049062 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:29:54.049122 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.049085 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:29:54.049225 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.049204 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mlklr\"" Apr 24 21:29:54.049314 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.049297 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3ntcg8h599110\"" Apr 24 21:29:54.049545 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.049529 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:29:54.049623 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.049559 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:29:54.049673 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.049634 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:29:54.050018 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.050003 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:29:54.050310 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.050296 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:29:54.051043 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.051026 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 21:29:54.052065 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.052045 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:29:54.054190 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.054170 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:29:54.058904 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.058882 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:29:54.126030 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.125991 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-config\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:54.126218 ip-10-0-133-48 
kubenswrapper[2574]: I0424 21:29:54.126047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:54.126218 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126079 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:54.126218 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126126 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:54.126218 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126147 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:54.126218 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126174 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:54.126502 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126235 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75be8ed3-6ddf-499c-b88f-c337ade73abc-config-out\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:54.126502 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:54.126502 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75be8ed3-6ddf-499c-b88f-c337ade73abc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:54.126502 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126455 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:29:54.126502 ip-10-0-133-48 kubenswrapper[2574]: 
I0424 21:29:54.126486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.126753 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.126753 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126616 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.126753 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126637 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.126753 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126653 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxh59\" (UniqueName: \"kubernetes.io/projected/75be8ed3-6ddf-499c-b88f-c337ade73abc-kube-api-access-vxh59\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.126753 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126680 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-web-config\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.126753 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126735 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.127042 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.126789 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.228207 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.228161 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-web-config\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.228423 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.228235 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.228423 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.228278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.228423 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.228307 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-config\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.228423 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.228392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.228629 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.228440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.228629 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.228496 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.228629 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.228521 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.228629 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.228550 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.228629 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.228615 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75be8ed3-6ddf-499c-b88f-c337ade73abc-config-out\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.229509 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.229475 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.230424 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.230395 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.230537 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.230457 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.230537 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.230491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75be8ed3-6ddf-499c-b88f-c337ade73abc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.230537 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.230519 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.230712 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.230548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.230712 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.230624 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.230712 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.230659 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.230712 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.230687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.230712 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.230705 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.230712 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.230711 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxh59\" (UniqueName: \"kubernetes.io/projected/75be8ed3-6ddf-499c-b88f-c337ade73abc-kube-api-access-vxh59\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.231937 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.231909 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75be8ed3-6ddf-499c-b88f-c337ade73abc-config-out\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.232678 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.232205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.232678 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.232523 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.233713 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.233200 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.233713 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.233372 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-web-config\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.233713 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.233666 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.235424 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.235369 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.239064 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.238962 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.239064 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.239022 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.239064 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.239039 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75be8ed3-6ddf-499c-b88f-c337ade73abc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.239064 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.239024 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.239361 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.239207 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-config\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.239361 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.239223 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.239473 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.239426 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.247655 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.247625 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxh59\" (UniqueName: \"kubernetes.io/projected/75be8ed3-6ddf-499c-b88f-c337ade73abc-kube-api-access-vxh59\") pod \"prometheus-k8s-0\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.355976 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.355861 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:29:54.514643 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.514403 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 21:29:54.517869 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:29:54.517835 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75be8ed3_6ddf_499c_b88f_c337ade73abc.slice/crio-feea61d90d7dc16d9b427bf339f3cde7dbe93e0d7d4a4f65a93e7e72b3eccb71 WatchSource:0}: Error finding container feea61d90d7dc16d9b427bf339f3cde7dbe93e0d7d4a4f65a93e7e72b3eccb71: Status 404 returned error can't find the container with id feea61d90d7dc16d9b427bf339f3cde7dbe93e0d7d4a4f65a93e7e72b3eccb71
Apr 24 21:29:54.768975 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:54.768928 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerStarted","Data":"feea61d90d7dc16d9b427bf339f3cde7dbe93e0d7d4a4f65a93e7e72b3eccb71"}
Apr 24 21:29:55.773046 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:55.772947 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl" event={"ID":"8b155916-68a3-40b1-8d71-903f176840f4","Type":"ContainerStarted","Data":"a9bb3dadbbb91ad89f00c5a4838843ab8b0d57bce82830da2cf8e0ecd35cc880"}
Apr 24 21:29:55.773509 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:55.773108 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl"
Apr 24 21:29:55.774572 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:55.774545 2574 generic.go:358] "Generic (PLEG): container finished" podID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerID="5796b667c2e20c21c11e2ed8268be22595ebbde84fd837410e51a22f1b0c154a" exitCode=0
Apr 24 21:29:55.774704 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:55.774583 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerDied","Data":"5796b667c2e20c21c11e2ed8268be22595ebbde84fd837410e51a22f1b0c154a"}
Apr 24 21:29:55.778494 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:55.778470 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl"
Apr 24 21:29:55.809746 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:55.809701 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-dc4hl" podStartSLOduration=2.398339962 podStartE2EDuration="3.809686215s" podCreationTimestamp="2026-04-24 21:29:52 +0000 UTC" firstStartedPulling="2026-04-24 21:29:53.520916763 +0000 UTC m=+200.867823086" lastFinishedPulling="2026-04-24 21:29:54.93226301 +0000 UTC m=+202.279169339" observedRunningTime="2026-04-24 21:29:55.791681379 +0000 UTC m=+203.138587724" watchObservedRunningTime="2026-04-24 21:29:55.809686215 +0000 UTC m=+203.156592560"
Apr 24 21:29:58.771518 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:58.771481 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6b5dc768d8-mcjn6"
Apr 24 21:29:58.785874 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:58.785835 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerStarted","Data":"0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc"}
Apr 24 21:29:58.786053 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:58.785882 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerStarted","Data":"32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228"}
Apr 24 21:29:58.786053 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:58.785898 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerStarted","Data":"09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009"}
Apr 24 21:29:58.786053 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:58.785911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerStarted","Data":"1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0"}
Apr 24 21:29:59.791072 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:59.791031 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerStarted","Data":"ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e"}
Apr 24 21:29:59.791072 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:59.791068 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerStarted","Data":"46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca"}
Apr 24 21:29:59.824835 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:29:59.824766 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.885858014 podStartE2EDuration="5.824743575s" podCreationTimestamp="2026-04-24 21:29:54 +0000 UTC" firstStartedPulling="2026-04-24 21:29:54.521272419 +0000 UTC m=+201.868178758" lastFinishedPulling="2026-04-24 21:29:58.460157994 +0000 UTC m=+205.807064319" observedRunningTime="2026-04-24 21:29:59.82243203 +0000 UTC m=+207.169338375" watchObservedRunningTime="2026-04-24 21:29:59.824743575 +0000 UTC m=+207.171649922"
Apr 24 21:30:02.985647 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:02.985609 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-cc4869f54-mx6ks"]
Apr 24 21:30:02.986010 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:30:02.985825 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-cc4869f54-mx6ks" podUID="82766c41-fe9c-427b-8e96-271dc66e21d8"
Apr 24 21:30:03.804774 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.804740 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:30:03.810534 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.810513 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:30:03.924317 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.924281 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82766c41-fe9c-427b-8e96-271dc66e21d8-installation-pull-secrets\") pod \"82766c41-fe9c-427b-8e96-271dc66e21d8\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") "
Apr 24 21:30:03.924510 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.924351 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/82766c41-fe9c-427b-8e96-271dc66e21d8-image-registry-private-configuration\") pod \"82766c41-fe9c-427b-8e96-271dc66e21d8\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") "
Apr 24 21:30:03.924510 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.924423 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-certificates\") pod \"82766c41-fe9c-427b-8e96-271dc66e21d8\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") "
Apr 24 21:30:03.924510 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.924442 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82766c41-fe9c-427b-8e96-271dc66e21d8-ca-trust-extracted\") pod \"82766c41-fe9c-427b-8e96-271dc66e21d8\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") "
Apr 24 21:30:03.924510 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.924464 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82766c41-fe9c-427b-8e96-271dc66e21d8-trusted-ca\") pod \"82766c41-fe9c-427b-8e96-271dc66e21d8\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") "
Apr 24 21:30:03.924510 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.924510 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-bound-sa-token\") pod \"82766c41-fe9c-427b-8e96-271dc66e21d8\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") "
Apr 24 21:30:03.924747 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.924531 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzgd5\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-kube-api-access-pzgd5\") pod \"82766c41-fe9c-427b-8e96-271dc66e21d8\" (UID: \"82766c41-fe9c-427b-8e96-271dc66e21d8\") "
Apr 24 21:30:03.924808 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.924773 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82766c41-fe9c-427b-8e96-271dc66e21d8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "82766c41-fe9c-427b-8e96-271dc66e21d8" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:30:03.924863 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.924803 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "82766c41-fe9c-427b-8e96-271dc66e21d8" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:30:03.924953 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.924930 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82766c41-fe9c-427b-8e96-271dc66e21d8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "82766c41-fe9c-427b-8e96-271dc66e21d8" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:30:03.926770 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.926732 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82766c41-fe9c-427b-8e96-271dc66e21d8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "82766c41-fe9c-427b-8e96-271dc66e21d8" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:30:03.926989 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.926966 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-kube-api-access-pzgd5" (OuterVolumeSpecName: "kube-api-access-pzgd5") pod "82766c41-fe9c-427b-8e96-271dc66e21d8" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8"). InnerVolumeSpecName "kube-api-access-pzgd5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:30:03.927070 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.926973 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82766c41-fe9c-427b-8e96-271dc66e21d8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "82766c41-fe9c-427b-8e96-271dc66e21d8" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:30:03.927070 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:03.926999 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "82766c41-fe9c-427b-8e96-271dc66e21d8" (UID: "82766c41-fe9c-427b-8e96-271dc66e21d8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:30:04.025660 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:04.025624 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-certificates\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 21:30:04.025660 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:04.025658 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82766c41-fe9c-427b-8e96-271dc66e21d8-ca-trust-extracted\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 21:30:04.026190 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:04.025673 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82766c41-fe9c-427b-8e96-271dc66e21d8-trusted-ca\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 21:30:04.026190 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:04.025687 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-bound-sa-token\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 21:30:04.026190 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:04.025700 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pzgd5\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-kube-api-access-pzgd5\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 21:30:04.026190 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:04.025713 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82766c41-fe9c-427b-8e96-271dc66e21d8-installation-pull-secrets\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 21:30:04.026190 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:04.025727 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/82766c41-fe9c-427b-8e96-271dc66e21d8-image-registry-private-configuration\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 21:30:04.356316 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:04.356286 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 21:30:04.807796 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:04.807767 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cc4869f54-mx6ks"
Apr 24 21:30:04.847505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:04.847468 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-cc4869f54-mx6ks"]
Apr 24 21:30:04.854187 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:04.854158 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-cc4869f54-mx6ks"]
Apr 24 21:30:04.934308 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:04.934264 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82766c41-fe9c-427b-8e96-271dc66e21d8-registry-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 21:30:05.146945 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:05.146862 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82766c41-fe9c-427b-8e96-271dc66e21d8" path="/var/lib/kubelet/pods/82766c41-fe9c-427b-8e96-271dc66e21d8/volumes"
Apr 24 21:30:07.143489 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:07.143456 2574 scope.go:117] "RemoveContainer" containerID="8d29c6b9b7f3bc1c27e5abca8007265ef97ee568edf4431a47202d3cec17c082"
Apr 24 21:30:07.817434 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:07.817408 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log"
Apr 24 21:30:07.817640 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:07.817493 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" event={"ID":"219a5443-bbde-4ab4-bb73-46a6160644d2","Type":"ContainerStarted","Data":"b8297f9550cbba1b33450063f275a5cf1d9eb07ccf2c545ad5adb68adf964065"}
Apr 24 21:30:07.817791 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:07.817762 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:30:07.840193 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:07.840131 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc" podStartSLOduration=53.929495963 podStartE2EDuration="55.840111998s" podCreationTimestamp="2026-04-24 21:29:12 +0000 UTC" firstStartedPulling="2026-04-24 21:29:12.509235897 +0000 UTC m=+159.856142221" lastFinishedPulling="2026-04-24 21:29:14.419851932 +0000 UTC m=+161.766758256" observedRunningTime="2026-04-24 21:30:07.838978885 +0000 UTC m=+215.185885233" watchObservedRunningTime="2026-04-24 21:30:07.840111998 +0000 UTC m=+215.187018345"
Apr 24 21:30:07.884444 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:07.884414 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-jpjmc"
Apr 24 21:30:38.279602 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:38.279570 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-g2zr8_eab474e7-7b20-4ca7-aedb-49915bb5ec3e/dns-node-resolver/0.log"
Apr 24 21:30:44.983290 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:44.983192 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc"
Apr 24 21:30:44.985653 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:44.985627 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2af22c1-baca-4054-87ff-daf77606438a-metrics-certs\") pod \"network-metrics-daemon-jtqkc\" (UID: \"e2af22c1-baca-4054-87ff-daf77606438a\") " pod="openshift-multus/network-metrics-daemon-jtqkc"
Apr 24 21:30:45.247407 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:45.247297 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5kts7\""
Apr 24 21:30:45.254939 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:45.254916 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jtqkc"
Apr 24 21:30:45.375607 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:45.375389 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jtqkc"]
Apr 24 21:30:45.378121 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:30:45.378088 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2af22c1_baca_4054_87ff_daf77606438a.slice/crio-35b744754e527238a6030be92f78d697eb490868975c3fb814af0519662fd14d WatchSource:0}: Error finding container 35b744754e527238a6030be92f78d697eb490868975c3fb814af0519662fd14d: Status 404 returned error can't find the container with id 35b744754e527238a6030be92f78d697eb490868975c3fb814af0519662fd14d
Apr 24 21:30:45.920819 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:45.920781 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jtqkc" event={"ID":"e2af22c1-baca-4054-87ff-daf77606438a","Type":"ContainerStarted","Data":"35b744754e527238a6030be92f78d697eb490868975c3fb814af0519662fd14d"}
Apr 24 21:30:46.926273 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:46.926228 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jtqkc" event={"ID":"e2af22c1-baca-4054-87ff-daf77606438a","Type":"ContainerStarted","Data":"339911453fb5b1db2e42b02cb7c7348c75bad3d8c19528784019817ac6f73b99"}
Apr 24 21:30:46.926273 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:46.926272 2574
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jtqkc" event={"ID":"e2af22c1-baca-4054-87ff-daf77606438a","Type":"ContainerStarted","Data":"5fb71731a66072c3996f4862e19374743af29c011a994db62dce6e68c78a8523"} Apr 24 21:30:46.944127 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:46.944077 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jtqkc" podStartSLOduration=253.000191318 podStartE2EDuration="4m13.944061835s" podCreationTimestamp="2026-04-24 21:26:33 +0000 UTC" firstStartedPulling="2026-04-24 21:30:45.380036572 +0000 UTC m=+252.726942895" lastFinishedPulling="2026-04-24 21:30:46.323907083 +0000 UTC m=+253.670813412" observedRunningTime="2026-04-24 21:30:46.942974647 +0000 UTC m=+254.289880994" watchObservedRunningTime="2026-04-24 21:30:46.944061835 +0000 UTC m=+254.290968180" Apr 24 21:30:54.356502 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:54.356452 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:54.375871 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:54.375841 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:30:54.971257 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:30:54.971232 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:11.605082 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:31:11.605020 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" podUID="1cd5a898-ba76-4c36-ab46-16db7f1b61bd" Apr 24 21:31:11.605082 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:31:11.605020 2574 
pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zq6rm" podUID="645772fe-eb62-4443-a6dd-6a10b3593053" Apr 24 21:31:11.605082 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:31:11.605017 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-drrd5" podUID="ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399" Apr 24 21:31:12.003543 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:12.003514 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zq6rm" Apr 24 21:31:12.003736 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:12.003514 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" Apr 24 21:31:12.003736 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:12.003520 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-drrd5" Apr 24 21:31:12.659821 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:12.659781 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:12.660458 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:12.660423 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="prometheus" containerID="cri-o://1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0" gracePeriod=600 Apr 24 21:31:12.661792 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:12.660795 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="kube-rbac-proxy-thanos" containerID="cri-o://ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e" gracePeriod=600 Apr 24 21:31:12.661792 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:12.660842 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="kube-rbac-proxy" containerID="cri-o://46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca" gracePeriod=600 Apr 24 21:31:12.661792 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:12.660927 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="thanos-sidecar" containerID="cri-o://32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228" gracePeriod=600 Apr 24 21:31:12.662412 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:12.660445 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="kube-rbac-proxy-web" containerID="cri-o://0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc" gracePeriod=600 Apr 24 21:31:12.662412 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:12.662001 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="config-reloader" containerID="cri-o://09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009" gracePeriod=600 Apr 24 21:31:13.010294 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:13.010263 2574 generic.go:358] "Generic (PLEG): container finished" podID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerID="ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e" exitCode=0 Apr 24 21:31:13.010294 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:13.010289 2574 generic.go:358] "Generic (PLEG): container finished" podID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerID="46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca" exitCode=0 Apr 24 21:31:13.010294 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:13.010296 2574 generic.go:358] "Generic (PLEG): container finished" podID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerID="32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228" exitCode=0 Apr 24 21:31:13.010294 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:13.010301 2574 generic.go:358] "Generic (PLEG): container finished" podID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerID="09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009" exitCode=0 Apr 24 21:31:13.010294 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:13.010306 2574 generic.go:358] "Generic (PLEG): container finished" podID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerID="1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0" exitCode=0 Apr 24 21:31:13.010626 ip-10-0-133-48 kubenswrapper[2574]: 
I0424 21:31:13.010363 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerDied","Data":"ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e"} Apr 24 21:31:13.010626 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:13.010392 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerDied","Data":"46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca"} Apr 24 21:31:13.010626 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:13.010401 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerDied","Data":"32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228"} Apr 24 21:31:13.010626 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:13.010410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerDied","Data":"09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009"} Apr 24 21:31:13.010626 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:13.010418 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerDied","Data":"1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0"} Apr 24 21:31:13.896177 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:13.896153 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.015782 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.015692 2574 generic.go:358] "Generic (PLEG): container finished" podID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerID="0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc" exitCode=0 Apr 24 21:31:14.015956 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.015796 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.015956 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.015794 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerDied","Data":"0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc"} Apr 24 21:31:14.015956 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.015845 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"75be8ed3-6ddf-499c-b88f-c337ade73abc","Type":"ContainerDied","Data":"feea61d90d7dc16d9b427bf339f3cde7dbe93e0d7d4a4f65a93e7e72b3eccb71"} Apr 24 21:31:14.015956 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.015866 2574 scope.go:117] "RemoveContainer" containerID="ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e" Apr 24 21:31:14.023547 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.023530 2574 scope.go:117] "RemoveContainer" containerID="46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca" Apr 24 21:31:14.030041 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.030024 2574 scope.go:117] "RemoveContainer" containerID="0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc" Apr 24 21:31:14.036393 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.036377 2574 scope.go:117] "RemoveContainer" containerID="32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228" Apr 
24 21:31:14.042450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.042433 2574 scope.go:117] "RemoveContainer" containerID="09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009" Apr 24 21:31:14.048795 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.048776 2574 scope.go:117] "RemoveContainer" containerID="1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0" Apr 24 21:31:14.049653 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.049638 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-config\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.049714 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.049667 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-kubelet-serving-ca-bundle\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.049714 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.049698 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-tls\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.049787 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.049717 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.049787 
ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.049741 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-web-config\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.049787 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.049771 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-metrics-client-certs\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.049927 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.049805 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-metrics-client-ca\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.050118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050086 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:14.050118 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050110 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.050258 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050143 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-grpc-tls\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.050258 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050171 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-kube-rbac-proxy\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.050258 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050199 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-k8s-db\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.050258 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050232 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-trusted-ca-bundle\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: 
\"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.050505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050307 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-serving-certs-ca-bundle\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.050505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050363 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75be8ed3-6ddf-499c-b88f-c337ade73abc-config-out\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.050505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050390 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxh59\" (UniqueName: \"kubernetes.io/projected/75be8ed3-6ddf-499c-b88f-c337ade73abc-kube-api-access-vxh59\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.050505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050444 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75be8ed3-6ddf-499c-b88f-c337ade73abc-tls-assets\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.050505 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050476 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-k8s-rulefiles-0\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.050752 
ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050517 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-thanos-prometheus-http-client-file\") pod \"75be8ed3-6ddf-499c-b88f-c337ade73abc\" (UID: \"75be8ed3-6ddf-499c-b88f-c337ade73abc\") " Apr 24 21:31:14.050752 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050722 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:14.050860 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050811 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.050860 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.050830 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.051462 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.051114 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). 
InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:14.051558 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.051525 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:14.052610 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.052574 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:31:14.052872 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.052846 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:31:14.053963 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.053930 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-config" (OuterVolumeSpecName: "config") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:14.054171 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.054120 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75be8ed3-6ddf-499c-b88f-c337ade73abc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:14.054704 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.054558 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:14.054704 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.054678 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:14.055140 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.055100 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:14.055140 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.055127 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:14.055302 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.055150 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:14.055302 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.055212 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75be8ed3-6ddf-499c-b88f-c337ade73abc-config-out" (OuterVolumeSpecName: "config-out") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:31:14.055671 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.055650 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:14.055876 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.055860 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75be8ed3-6ddf-499c-b88f-c337ade73abc-kube-api-access-vxh59" (OuterVolumeSpecName: "kube-api-access-vxh59") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "kube-api-access-vxh59". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:14.056353 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.056307 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:14.063833 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.063808 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-web-config" (OuterVolumeSpecName: "web-config") pod "75be8ed3-6ddf-499c-b88f-c337ade73abc" (UID: "75be8ed3-6ddf-499c-b88f-c337ade73abc"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:31:14.071892 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.071874 2574 scope.go:117] "RemoveContainer" containerID="5796b667c2e20c21c11e2ed8268be22595ebbde84fd837410e51a22f1b0c154a" Apr 24 21:31:14.078582 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.078559 2574 scope.go:117] "RemoveContainer" containerID="ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e" Apr 24 21:31:14.078848 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:31:14.078826 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e\": container with ID starting with ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e not found: ID does not exist" containerID="ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e" Apr 24 21:31:14.078892 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.078861 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e"} err="failed to get container status \"ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e\": rpc error: code = NotFound desc = could not find container \"ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e\": container with ID starting with ebf388f1d61b2f1367d15094ffe6ac012c206f98b50479748a783653d30b914e not found: ID does not exist" Apr 24 21:31:14.078935 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.078895 2574 scope.go:117] "RemoveContainer" containerID="46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca" Apr 24 21:31:14.079111 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:31:14.079098 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca\": container with ID starting with 46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca not found: ID does not exist" containerID="46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca" Apr 24 21:31:14.079149 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.079114 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca"} err="failed to get container status \"46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca\": rpc error: code = NotFound desc = could not find container \"46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca\": container with ID starting with 46e0bb2d1d1ab4c7db3768836d344972abc291da89bd10494ac6a48ec99607ca not found: ID does not exist" Apr 24 21:31:14.079149 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.079126 2574 scope.go:117] "RemoveContainer" containerID="0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc" Apr 24 21:31:14.079370 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:31:14.079351 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc\": container with ID starting with 0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc not found: ID does not exist" containerID="0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc" Apr 24 21:31:14.079441 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.079372 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc"} err="failed to get container status \"0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc\": rpc error: code = NotFound desc = could not find container 
\"0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc\": container with ID starting with 0f2dbe5f2801acb55acba95793ecbc68eaa21b0f755002a802d56b3efa5b0bdc not found: ID does not exist" Apr 24 21:31:14.079441 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.079384 2574 scope.go:117] "RemoveContainer" containerID="32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228" Apr 24 21:31:14.079593 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:31:14.079577 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228\": container with ID starting with 32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228 not found: ID does not exist" containerID="32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228" Apr 24 21:31:14.079630 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.079596 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228"} err="failed to get container status \"32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228\": rpc error: code = NotFound desc = could not find container \"32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228\": container with ID starting with 32df60ec4b5e89e68338434e0cb6eae30b0de36d1f88ec0bd61e4cf86cd5b228 not found: ID does not exist" Apr 24 21:31:14.079630 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.079607 2574 scope.go:117] "RemoveContainer" containerID="09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009" Apr 24 21:31:14.079805 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:31:14.079791 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009\": container with ID starting with 
09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009 not found: ID does not exist" containerID="09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009" Apr 24 21:31:14.079843 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.079807 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009"} err="failed to get container status \"09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009\": rpc error: code = NotFound desc = could not find container \"09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009\": container with ID starting with 09498e7e3a824d68895c390443a895a73729f805b6fef8afd2bd1c9658c4e009 not found: ID does not exist" Apr 24 21:31:14.079843 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.079817 2574 scope.go:117] "RemoveContainer" containerID="1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0" Apr 24 21:31:14.080041 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:31:14.080020 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0\": container with ID starting with 1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0 not found: ID does not exist" containerID="1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0" Apr 24 21:31:14.080089 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.080047 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0"} err="failed to get container status \"1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0\": rpc error: code = NotFound desc = could not find container \"1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0\": container with ID starting with 
1b24caf208b6a34e9d7d0d4fbbd5c254d3bd321cf53c4fcdfee2a7e6e4d11cb0 not found: ID does not exist" Apr 24 21:31:14.080089 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.080065 2574 scope.go:117] "RemoveContainer" containerID="5796b667c2e20c21c11e2ed8268be22595ebbde84fd837410e51a22f1b0c154a" Apr 24 21:31:14.080250 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:31:14.080235 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5796b667c2e20c21c11e2ed8268be22595ebbde84fd837410e51a22f1b0c154a\": container with ID starting with 5796b667c2e20c21c11e2ed8268be22595ebbde84fd837410e51a22f1b0c154a not found: ID does not exist" containerID="5796b667c2e20c21c11e2ed8268be22595ebbde84fd837410e51a22f1b0c154a" Apr 24 21:31:14.080292 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.080254 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5796b667c2e20c21c11e2ed8268be22595ebbde84fd837410e51a22f1b0c154a"} err="failed to get container status \"5796b667c2e20c21c11e2ed8268be22595ebbde84fd837410e51a22f1b0c154a\": rpc error: code = NotFound desc = could not find container \"5796b667c2e20c21c11e2ed8268be22595ebbde84fd837410e51a22f1b0c154a\": container with ID starting with 5796b667c2e20c21c11e2ed8268be22595ebbde84fd837410e51a22f1b0c154a not found: ID does not exist" Apr 24 21:31:14.152127 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152094 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/75be8ed3-6ddf-499c-b88f-c337ade73abc-tls-assets\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152127 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152121 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-k8s-rulefiles-0\") on node 
\"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152127 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152132 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152143 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-config\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152152 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152162 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152171 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-web-config\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152180 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-metrics-client-certs\") on node \"ip-10-0-133-48.ec2.internal\" 
DevicePath \"\"" Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152188 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-configmap-metrics-client-ca\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152196 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152205 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-grpc-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152213 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75be8ed3-6ddf-499c-b88f-c337ade73abc-secret-kube-rbac-proxy\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152222 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-k8s-db\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152230 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75be8ed3-6ddf-499c-b88f-c337ade73abc-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" 
Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152240 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/75be8ed3-6ddf-499c-b88f-c337ade73abc-config-out\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.152374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.152248 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vxh59\" (UniqueName: \"kubernetes.io/projected/75be8ed3-6ddf-499c-b88f-c337ade73abc-kube-api-access-vxh59\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.343255 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.343221 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:14.350770 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.350744 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:14.379909 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.379879 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:14.380211 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380197 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="config-reloader" Apr 24 21:31:14.380284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380213 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="config-reloader" Apr 24 21:31:14.380284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380223 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="thanos-sidecar" Apr 24 21:31:14.380284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380232 2574 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="thanos-sidecar" Apr 24 21:31:14.380284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380244 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="kube-rbac-proxy" Apr 24 21:31:14.380284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380249 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="kube-rbac-proxy" Apr 24 21:31:14.380284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380262 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="init-config-reloader" Apr 24 21:31:14.380284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380268 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="init-config-reloader" Apr 24 21:31:14.380284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380276 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="prometheus" Apr 24 21:31:14.380284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380282 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="prometheus" Apr 24 21:31:14.380284 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380288 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="kube-rbac-proxy-thanos" Apr 24 21:31:14.380624 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380295 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="kube-rbac-proxy-thanos" Apr 24 21:31:14.380624 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380307 2574 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="kube-rbac-proxy-web" Apr 24 21:31:14.380624 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380315 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="kube-rbac-proxy-web" Apr 24 21:31:14.380624 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380372 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="kube-rbac-proxy-thanos" Apr 24 21:31:14.380624 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380383 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="config-reloader" Apr 24 21:31:14.380624 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380394 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="kube-rbac-proxy-web" Apr 24 21:31:14.380624 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380403 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="kube-rbac-proxy" Apr 24 21:31:14.380624 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380410 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="thanos-sidecar" Apr 24 21:31:14.380624 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.380415 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" containerName="prometheus" Apr 24 21:31:14.384230 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.384211 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.386867 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.386843 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 21:31:14.386974 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.386955 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 21:31:14.387051 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.386955 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 21:31:14.387106 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.387054 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 21:31:14.387304 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.387277 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 21:31:14.387491 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.387320 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 21:31:14.387491 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.387351 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 21:31:14.387491 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.387366 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mlklr\"" Apr 24 21:31:14.387815 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.387799 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 
21:31:14.387930 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.387888 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:31:14.388005 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.387928 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 21:31:14.388201 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.388188 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3ntcg8h599110\"" Apr 24 21:31:14.388290 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.388252 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 21:31:14.390279 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.390261 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 21:31:14.393297 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.393274 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 21:31:14.397773 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.397751 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:14.556403 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556357 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-config\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556403 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556399 
2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-web-config\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556623 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556433 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556623 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556461 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556623 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556519 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556623 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556623 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556565 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556623 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556606 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556821 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556637 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-config-out\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556821 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556821 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556821 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556700 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556821 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556718 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556821 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556736 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556821 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79l4f\" (UniqueName: \"kubernetes.io/projected/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-kube-api-access-79l4f\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556821 
ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.556821 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556812 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.557105 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.556837 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.657776 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.657677 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.657776 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.657737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.657776 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.657769 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658014 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.657798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-config\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658014 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.657820 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-web-config\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658014 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.657851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658014 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.657983 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658172 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.658037 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658172 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.658073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658172 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.658097 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658172 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.658140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658409 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.658180 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-config-out\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658409 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.658229 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658409 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.658260 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658409 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.658284 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658661 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.658354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
21:31:14.658728 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.658683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.658728 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.658710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79l4f\" (UniqueName: \"kubernetes.io/projected/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-kube-api-access-79l4f\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.660968 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.660945 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-config\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.661076 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.661030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.661152 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.661080 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.661152 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.661096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.661152 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.661143 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-web-config\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.661293 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.661184 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-config-out\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.661293 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.661244 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.662065 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.661807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
24 21:31:14.662065 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.661852 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.662065 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.661893 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.662065 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.661971 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.662323 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.662266 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.662527 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.662500 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.663747 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.663729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.663825 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.663787 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.664005 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.663987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.664267 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.664251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.668258 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.668236 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79l4f\" (UniqueName: \"kubernetes.io/projected/69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7-kube-api-access-79l4f\") pod 
\"prometheus-k8s-0\" (UID: \"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.694368 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.694321 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:14.826832 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:14.826715 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 21:31:14.829621 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:31:14.829593 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69c2a0ff_61af_4bfd_9c4b_89e876dcc4c7.slice/crio-cf3c23909a6c1fb174d459494de7a31725fb0eae0316454cd845d8587d5132d4 WatchSource:0}: Error finding container cf3c23909a6c1fb174d459494de7a31725fb0eae0316454cd845d8587d5132d4: Status 404 returned error can't find the container with id cf3c23909a6c1fb174d459494de7a31725fb0eae0316454cd845d8587d5132d4 Apr 24 21:31:15.020028 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.019994 2574 generic.go:358] "Generic (PLEG): container finished" podID="69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7" containerID="66357ddaf6c18bf27dab79033f715384e8dfbd0ea475c106cd5249e0bfa85159" exitCode=0 Apr 24 21:31:15.020483 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.020085 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7","Type":"ContainerDied","Data":"66357ddaf6c18bf27dab79033f715384e8dfbd0ea475c106cd5249e0bfa85159"} Apr 24 21:31:15.020483 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.020128 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7","Type":"ContainerStarted","Data":"cf3c23909a6c1fb174d459494de7a31725fb0eae0316454cd845d8587d5132d4"} Apr 24 
21:31:15.062343 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.062302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm" Apr 24 21:31:15.062491 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.062369 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" Apr 24 21:31:15.062491 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.062403 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5" Apr 24 21:31:15.064650 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.064628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399-metrics-tls\") pod \"dns-default-drrd5\" (UID: \"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399\") " pod="openshift-dns/dns-default-drrd5" Apr 24 21:31:15.064816 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.064796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1cd5a898-ba76-4c36-ab46-16db7f1b61bd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jkddd\" (UID: \"1cd5a898-ba76-4c36-ab46-16db7f1b61bd\") 
" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" Apr 24 21:31:15.064872 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.064812 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/645772fe-eb62-4443-a6dd-6a10b3593053-cert\") pod \"ingress-canary-zq6rm\" (UID: \"645772fe-eb62-4443-a6dd-6a10b3593053\") " pod="openshift-ingress-canary/ingress-canary-zq6rm" Apr 24 21:31:15.151679 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.149126 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75be8ed3-6ddf-499c-b88f-c337ade73abc" path="/var/lib/kubelet/pods/75be8ed3-6ddf-499c-b88f-c337ade73abc/volumes" Apr 24 21:31:15.306993 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.306964 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-42hwh\"" Apr 24 21:31:15.307157 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.306964 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-pxhww\"" Apr 24 21:31:15.307157 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.306964 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mzt4d\"" Apr 24 21:31:15.315168 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.315137 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-drrd5" Apr 24 21:31:15.315478 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.315137 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" Apr 24 21:31:15.315658 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.315238 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zq6rm" Apr 24 21:31:15.523553 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.523517 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jkddd"] Apr 24 21:31:15.526806 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:31:15.526735 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cd5a898_ba76_4c36_ab46_16db7f1b61bd.slice/crio-51ff99fea22cf4b558e5dccf0aa4d4d18ddfd0ded7c8cfda8683a4b39eeb87ce WatchSource:0}: Error finding container 51ff99fea22cf4b558e5dccf0aa4d4d18ddfd0ded7c8cfda8683a4b39eeb87ce: Status 404 returned error can't find the container with id 51ff99fea22cf4b558e5dccf0aa4d4d18ddfd0ded7c8cfda8683a4b39eeb87ce Apr 24 21:31:15.752182 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.752132 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-drrd5"] Apr 24 21:31:15.755779 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:15.755672 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zq6rm"] Apr 24 21:31:15.758325 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:31:15.758297 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff1a9672_5ebe_4d07_ac1a_56fbbdfdc399.slice/crio-ad08e0e784de698a46bafb42824f5fac4501f64f6317a7f03f2971cc1712d79c WatchSource:0}: Error finding container ad08e0e784de698a46bafb42824f5fac4501f64f6317a7f03f2971cc1712d79c: Status 404 returned error can't find the container with id ad08e0e784de698a46bafb42824f5fac4501f64f6317a7f03f2971cc1712d79c Apr 24 21:31:15.759538 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:31:15.759514 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod645772fe_eb62_4443_a6dd_6a10b3593053.slice/crio-557144deee011af48a61c51d754b4ab8cf257620f7a17c7985eff6c650fd2d0b WatchSource:0}: Error finding container 557144deee011af48a61c51d754b4ab8cf257620f7a17c7985eff6c650fd2d0b: Status 404 returned error can't find the container with id 557144deee011af48a61c51d754b4ab8cf257620f7a17c7985eff6c650fd2d0b Apr 24 21:31:16.026021 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:16.025924 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-drrd5" event={"ID":"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399","Type":"ContainerStarted","Data":"ad08e0e784de698a46bafb42824f5fac4501f64f6317a7f03f2971cc1712d79c"} Apr 24 21:31:16.029323 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:16.029299 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7","Type":"ContainerStarted","Data":"03e09a7ad30d123f36fa978476a6c7ab1bf19436d822a5ade4c35433ee694d85"} Apr 24 21:31:16.029323 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:16.029345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7","Type":"ContainerStarted","Data":"b6606cedefa78d84f538ee3636750eeb8d202b935bfa13a7de7ab6750871f07e"} Apr 24 21:31:16.029582 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:16.029361 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7","Type":"ContainerStarted","Data":"93beed758a34628f744627e240f3a5c07deef8cd8d28c75bc940eff43479365f"} Apr 24 21:31:16.029582 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:16.029373 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7","Type":"ContainerStarted","Data":"65746f5f3c162b9f2e5380c43f3829efcbe0183eba1f6d090f64c4e7167b02db"} Apr 24 21:31:16.029582 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:16.029383 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7","Type":"ContainerStarted","Data":"ac8f851807545e085b0e57076545716174f32468fa1605f5bc22223664cbf59c"} Apr 24 21:31:16.029582 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:16.029394 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7","Type":"ContainerStarted","Data":"18f045461c81e5bcc365774018b14cb7133373ee4ab847606a63da1ca72189f0"} Apr 24 21:31:16.030652 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:16.030629 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" event={"ID":"1cd5a898-ba76-4c36-ab46-16db7f1b61bd","Type":"ContainerStarted","Data":"51ff99fea22cf4b558e5dccf0aa4d4d18ddfd0ded7c8cfda8683a4b39eeb87ce"} Apr 24 21:31:16.031787 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:16.031756 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zq6rm" event={"ID":"645772fe-eb62-4443-a6dd-6a10b3593053","Type":"ContainerStarted","Data":"557144deee011af48a61c51d754b4ab8cf257620f7a17c7985eff6c650fd2d0b"} Apr 24 21:31:16.077754 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:16.076453 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.076432115 podStartE2EDuration="2.076432115s" podCreationTimestamp="2026-04-24 21:31:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:16.073702486 +0000 
UTC m=+283.420609076" watchObservedRunningTime="2026-04-24 21:31:16.076432115 +0000 UTC m=+283.423338465" Apr 24 21:31:17.036825 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:17.036782 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" event={"ID":"1cd5a898-ba76-4c36-ab46-16db7f1b61bd","Type":"ContainerStarted","Data":"61271b1fecaa859d0ecdfc5cfe508c6f8374815c416894e0df930747451fd040"} Apr 24 21:31:17.060574 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:17.060525 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jkddd" podStartSLOduration=282.073727487 podStartE2EDuration="4m43.060509844s" podCreationTimestamp="2026-04-24 21:26:34 +0000 UTC" firstStartedPulling="2026-04-24 21:31:15.52928283 +0000 UTC m=+282.876189154" lastFinishedPulling="2026-04-24 21:31:16.516065179 +0000 UTC m=+283.862971511" observedRunningTime="2026-04-24 21:31:17.056141773 +0000 UTC m=+284.403048120" watchObservedRunningTime="2026-04-24 21:31:17.060509844 +0000 UTC m=+284.407416189" Apr 24 21:31:18.043148 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:18.043036 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zq6rm" event={"ID":"645772fe-eb62-4443-a6dd-6a10b3593053","Type":"ContainerStarted","Data":"4c4b30a262e37deb3464fc58afc1295f166fb6ac3928345b10de62a891838701"} Apr 24 21:31:18.044678 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:18.044648 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-drrd5" event={"ID":"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399","Type":"ContainerStarted","Data":"eec1f16353e72e4e9c7a132af5bd19f0f194c391efa0fc8092c4df8783f6137c"} Apr 24 21:31:18.044787 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:18.044685 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-drrd5" 
event={"ID":"ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399","Type":"ContainerStarted","Data":"7d308187f555b0ae6d8b74c23a7f84843780da12a95d2ed9e0b09e8c7c67a270"} Apr 24 21:31:18.044857 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:18.044843 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-drrd5" Apr 24 21:31:18.062928 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:18.062879 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zq6rm" podStartSLOduration=252.071219266 podStartE2EDuration="4m14.062864557s" podCreationTimestamp="2026-04-24 21:27:04 +0000 UTC" firstStartedPulling="2026-04-24 21:31:15.761733612 +0000 UTC m=+283.108639936" lastFinishedPulling="2026-04-24 21:31:17.753378889 +0000 UTC m=+285.100285227" observedRunningTime="2026-04-24 21:31:18.0618025 +0000 UTC m=+285.408708846" watchObservedRunningTime="2026-04-24 21:31:18.062864557 +0000 UTC m=+285.409770903" Apr 24 21:31:19.695045 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:19.695003 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:31:28.049918 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:28.049883 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-drrd5" Apr 24 21:31:28.081564 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:28.081506 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-drrd5" podStartSLOduration=262.09098622 podStartE2EDuration="4m24.081478375s" podCreationTimestamp="2026-04-24 21:27:04 +0000 UTC" firstStartedPulling="2026-04-24 21:31:15.760701798 +0000 UTC m=+283.107608122" lastFinishedPulling="2026-04-24 21:31:17.751193938 +0000 UTC m=+285.098100277" observedRunningTime="2026-04-24 21:31:18.083539003 +0000 UTC m=+285.430445370" watchObservedRunningTime="2026-04-24 
21:31:28.081478375 +0000 UTC m=+295.428384720" Apr 24 21:31:33.091674 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:33.091268 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:31:33.091674 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:33.091320 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:31:33.098883 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:33.098863 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:31:33.099008 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:33.098971 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:31:33.102260 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:31:33.102243 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:32:14.694910 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:32:14.694823 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:14.710151 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:32:14.710123 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:32:15.224820 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:32:15.224792 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 21:36:33.115250 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:33.115220 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:36:33.116666 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:33.116643 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:36:33.121148 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:33.121127 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:36:33.122297 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:33.122268 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:36:41.052385 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.052349 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-nk48p"] Apr 24 21:36:41.054475 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.054457 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-nk48p"
Apr 24 21:36:41.059985 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.059964 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 21:36:41.060089 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.059968 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 24 21:36:41.060089 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.060028 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-hlqpc\""
Apr 24 21:36:41.061030 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.060892 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 21:36:41.067012 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.066987 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-nk48p"]
Apr 24 21:36:41.116096 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.116067 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8pkw\" (UniqueName: \"kubernetes.io/projected/86417de4-f634-4847-b71e-359bc85539ec-kube-api-access-q8pkw\") pod \"s3-init-nk48p\" (UID: \"86417de4-f634-4847-b71e-359bc85539ec\") " pod="kserve/s3-init-nk48p"
Apr 24 21:36:41.216503 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.216461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8pkw\" (UniqueName: \"kubernetes.io/projected/86417de4-f634-4847-b71e-359bc85539ec-kube-api-access-q8pkw\") pod \"s3-init-nk48p\" (UID: \"86417de4-f634-4847-b71e-359bc85539ec\") " pod="kserve/s3-init-nk48p"
Apr 24 21:36:41.226125 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.226098 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8pkw\" (UniqueName: \"kubernetes.io/projected/86417de4-f634-4847-b71e-359bc85539ec-kube-api-access-q8pkw\") pod \"s3-init-nk48p\" (UID: \"86417de4-f634-4847-b71e-359bc85539ec\") " pod="kserve/s3-init-nk48p"
Apr 24 21:36:41.363387 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.363281 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-nk48p"
Apr 24 21:36:41.538112 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.538089 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-nk48p"]
Apr 24 21:36:41.540759 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:36:41.540727 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86417de4_f634_4847_b71e_359bc85539ec.slice/crio-c1939ff23a563386ca1548819148ca40a19cd21c08a705a68ee866e05eb1a720 WatchSource:0}: Error finding container c1939ff23a563386ca1548819148ca40a19cd21c08a705a68ee866e05eb1a720: Status 404 returned error can't find the container with id c1939ff23a563386ca1548819148ca40a19cd21c08a705a68ee866e05eb1a720
Apr 24 21:36:41.542482 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.542463 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:36:41.983972 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:41.983919 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nk48p" event={"ID":"86417de4-f634-4847-b71e-359bc85539ec","Type":"ContainerStarted","Data":"c1939ff23a563386ca1548819148ca40a19cd21c08a705a68ee866e05eb1a720"}
Apr 24 21:36:45.997262 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:45.997223 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nk48p" event={"ID":"86417de4-f634-4847-b71e-359bc85539ec","Type":"ContainerStarted","Data":"521e193676d014c2abd7a63ffb0deb7c97e169c7b9a375f7a9aaa40a30a9e52d"}
Apr 24 21:36:46.013829 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:46.013775 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-nk48p" podStartSLOduration=0.6375231 podStartE2EDuration="5.013758458s" podCreationTimestamp="2026-04-24 21:36:41 +0000 UTC" firstStartedPulling="2026-04-24 21:36:41.542638161 +0000 UTC m=+608.889544491" lastFinishedPulling="2026-04-24 21:36:45.918873517 +0000 UTC m=+613.265779849" observedRunningTime="2026-04-24 21:36:46.012309074 +0000 UTC m=+613.359215420" watchObservedRunningTime="2026-04-24 21:36:46.013758458 +0000 UTC m=+613.360664804"
Apr 24 21:36:50.010510 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:50.010472 2574 generic.go:358] "Generic (PLEG): container finished" podID="86417de4-f634-4847-b71e-359bc85539ec" containerID="521e193676d014c2abd7a63ffb0deb7c97e169c7b9a375f7a9aaa40a30a9e52d" exitCode=0
Apr 24 21:36:50.010893 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:50.010552 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nk48p" event={"ID":"86417de4-f634-4847-b71e-359bc85539ec","Type":"ContainerDied","Data":"521e193676d014c2abd7a63ffb0deb7c97e169c7b9a375f7a9aaa40a30a9e52d"}
Apr 24 21:36:51.131310 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:51.131282 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-nk48p"
Apr 24 21:36:51.203546 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:51.203510 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8pkw\" (UniqueName: \"kubernetes.io/projected/86417de4-f634-4847-b71e-359bc85539ec-kube-api-access-q8pkw\") pod \"86417de4-f634-4847-b71e-359bc85539ec\" (UID: \"86417de4-f634-4847-b71e-359bc85539ec\") "
Apr 24 21:36:51.205766 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:51.205734 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86417de4-f634-4847-b71e-359bc85539ec-kube-api-access-q8pkw" (OuterVolumeSpecName: "kube-api-access-q8pkw") pod "86417de4-f634-4847-b71e-359bc85539ec" (UID: "86417de4-f634-4847-b71e-359bc85539ec"). InnerVolumeSpecName "kube-api-access-q8pkw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:36:51.304995 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:51.304904 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q8pkw\" (UniqueName: \"kubernetes.io/projected/86417de4-f634-4847-b71e-359bc85539ec-kube-api-access-q8pkw\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 21:36:52.017282 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:52.017255 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-nk48p"
Apr 24 21:36:52.017282 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:52.017273 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-nk48p" event={"ID":"86417de4-f634-4847-b71e-359bc85539ec","Type":"ContainerDied","Data":"c1939ff23a563386ca1548819148ca40a19cd21c08a705a68ee866e05eb1a720"}
Apr 24 21:36:52.017524 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:36:52.017300 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1939ff23a563386ca1548819148ca40a19cd21c08a705a68ee866e05eb1a720"
Apr 24 21:37:01.390926 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.390889 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"]
Apr 24 21:37:01.391301 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.391193 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86417de4-f634-4847-b71e-359bc85539ec" containerName="s3-init"
Apr 24 21:37:01.391301 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.391204 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="86417de4-f634-4847-b71e-359bc85539ec" containerName="s3-init"
Apr 24 21:37:01.391301 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.391280 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="86417de4-f634-4847-b71e-359bc85539ec" containerName="s3-init"
Apr 24 21:37:01.393374 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.393356 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.395516 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.395480 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-6ztvm\""
Apr 24 21:37:01.395643 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.395532 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:37:01.396280 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.396255 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\""
Apr 24 21:37:01.396400 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.396321 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\""
Apr 24 21:37:01.396400 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.396367 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 21:37:01.403480 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.403457 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"]
Apr 24 21:37:01.498254 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.498220 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-6z8g6\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.498254 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.498263 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-6z8g6\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.498523 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.498352 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-6z8g6\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.498523 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.498431 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgcj7\" (UniqueName: \"kubernetes.io/projected/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-kube-api-access-rgcj7\") pod \"isvc-xgboost-graph-predictor-669d8d6456-6z8g6\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.599706 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.599663 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-6z8g6\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.599706 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.599705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-6z8g6\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.599919 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.599836 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-6z8g6\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.599919 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.599907 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgcj7\" (UniqueName: \"kubernetes.io/projected/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-kube-api-access-rgcj7\") pod \"isvc-xgboost-graph-predictor-669d8d6456-6z8g6\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.600264 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.600244 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-6z8g6\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.600552 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.600532 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-6z8g6\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.602133 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.602111 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-6z8g6\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.609662 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.609641 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgcj7\" (UniqueName: \"kubernetes.io/projected/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-kube-api-access-rgcj7\") pod \"isvc-xgboost-graph-predictor-669d8d6456-6z8g6\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.704241 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.704204 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:01.830461 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:01.830432 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"]
Apr 24 21:37:01.833070 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:37:01.833041 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1ab7a10_0005_4d6c_a55d_b1d39e63c048.slice/crio-2f0366cb7bfce5239d821b28944efd376377745568fa250af247a9c47fa2feda WatchSource:0}: Error finding container 2f0366cb7bfce5239d821b28944efd376377745568fa250af247a9c47fa2feda: Status 404 returned error can't find the container with id 2f0366cb7bfce5239d821b28944efd376377745568fa250af247a9c47fa2feda
Apr 24 21:37:02.055213 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:02.055132 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" event={"ID":"a1ab7a10-0005-4d6c-a55d-b1d39e63c048","Type":"ContainerStarted","Data":"2f0366cb7bfce5239d821b28944efd376377745568fa250af247a9c47fa2feda"}
Apr 24 21:37:05.065658 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:05.065619 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" event={"ID":"a1ab7a10-0005-4d6c-a55d-b1d39e63c048","Type":"ContainerStarted","Data":"22ae9ac3152f0e18a2e65289108dcea1d4e4a1bad85c776c0e5286b7770924f6"}
Apr 24 21:37:09.079083 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:09.079048 2574 generic.go:358] "Generic (PLEG): container finished" podID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerID="22ae9ac3152f0e18a2e65289108dcea1d4e4a1bad85c776c0e5286b7770924f6" exitCode=0
Apr 24 21:37:09.079520 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:09.079120 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" event={"ID":"a1ab7a10-0005-4d6c-a55d-b1d39e63c048","Type":"ContainerDied","Data":"22ae9ac3152f0e18a2e65289108dcea1d4e4a1bad85c776c0e5286b7770924f6"}
Apr 24 21:37:27.138911 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:27.138876 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" event={"ID":"a1ab7a10-0005-4d6c-a55d-b1d39e63c048","Type":"ContainerStarted","Data":"e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a"}
Apr 24 21:37:30.151864 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:30.151824 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" event={"ID":"a1ab7a10-0005-4d6c-a55d-b1d39e63c048","Type":"ContainerStarted","Data":"ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad"}
Apr 24 21:37:30.152245 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:30.151948 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:30.174886 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:30.174832 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" podStartSLOduration=1.843463503 podStartE2EDuration="29.174814602s" podCreationTimestamp="2026-04-24 21:37:01 +0000 UTC" firstStartedPulling="2026-04-24 21:37:01.835011757 +0000 UTC m=+629.181918082" lastFinishedPulling="2026-04-24 21:37:29.166362858 +0000 UTC m=+656.513269181" observedRunningTime="2026-04-24 21:37:30.17282451 +0000 UTC m=+657.519730849" watchObservedRunningTime="2026-04-24 21:37:30.174814602 +0000 UTC m=+657.521720951"
Apr 24 21:37:31.155277 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:31.155245 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:31.156574 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:31.156524 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 24 21:37:32.158401 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:32.158361 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 24 21:37:37.162938 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:37.162908 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:37:37.163460 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:37.163434 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 24 21:37:47.163621 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:47.163579 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 24 21:37:57.163677 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:37:57.163637 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 24 21:38:07.164163 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:07.164122 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 24 21:38:17.163600 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:17.163517 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 24 21:38:20.897840 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:20.897803 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"]
Apr 24 21:38:20.901227 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:20.901211 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"
Apr 24 21:38:20.903539 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:20.903505 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-914c2-serving-cert\""
Apr 24 21:38:20.903641 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:20.903626 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-914c2-kube-rbac-proxy-sar-config\""
Apr 24 21:38:20.911456 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:20.911426 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"]
Apr 24 21:38:20.984857 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:20.984824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c-openshift-service-ca-bundle\") pod \"switch-graph-914c2-55b54cc8f8-kv6sm\" (UID: \"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c\") " pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"
Apr 24 21:38:20.985029 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:20.984886 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c-proxy-tls\") pod \"switch-graph-914c2-55b54cc8f8-kv6sm\" (UID: \"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c\") " pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"
Apr 24 21:38:21.085564 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:21.085518 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c-proxy-tls\") pod \"switch-graph-914c2-55b54cc8f8-kv6sm\" (UID: \"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c\") " pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"
Apr 24 21:38:21.085747 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:21.085596 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c-openshift-service-ca-bundle\") pod \"switch-graph-914c2-55b54cc8f8-kv6sm\" (UID: \"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c\") " pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"
Apr 24 21:38:21.086279 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:21.086255 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c-openshift-service-ca-bundle\") pod \"switch-graph-914c2-55b54cc8f8-kv6sm\" (UID: \"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c\") " pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"
Apr 24 21:38:21.087970 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:21.087946 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c-proxy-tls\") pod \"switch-graph-914c2-55b54cc8f8-kv6sm\" (UID: \"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c\") " pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"
Apr 24 21:38:21.211536 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:21.211479 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"
Apr 24 21:38:21.336263 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:21.336240 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"]
Apr 24 21:38:21.338712 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:38:21.338672 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2864ba5_2503_4c9d_b4a6_da3a6e5c896c.slice/crio-c7e91167b22dedfc32054f4ac25b2078cfd77558559fbed6032ac06e978e1da4 WatchSource:0}: Error finding container c7e91167b22dedfc32054f4ac25b2078cfd77558559fbed6032ac06e978e1da4: Status 404 returned error can't find the container with id c7e91167b22dedfc32054f4ac25b2078cfd77558559fbed6032ac06e978e1da4
Apr 24 21:38:22.302245 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:22.302204 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" event={"ID":"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c","Type":"ContainerStarted","Data":"c7e91167b22dedfc32054f4ac25b2078cfd77558559fbed6032ac06e978e1da4"}
Apr 24 21:38:23.306095 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:23.306061 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" event={"ID":"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c","Type":"ContainerStarted","Data":"497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582"}
Apr 24 21:38:23.306587 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:23.306123 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"
Apr 24 21:38:23.324461 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:23.324362 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" podStartSLOduration=1.5116990989999999 podStartE2EDuration="3.324320241s" podCreationTimestamp="2026-04-24 21:38:20 +0000 UTC" firstStartedPulling="2026-04-24 21:38:21.340457915 +0000 UTC m=+708.687364239" lastFinishedPulling="2026-04-24 21:38:23.153079044 +0000 UTC m=+710.499985381" observedRunningTime="2026-04-24 21:38:23.323933187 +0000 UTC m=+710.670839534" watchObservedRunningTime="2026-04-24 21:38:23.324320241 +0000 UTC m=+710.671226588"
Apr 24 21:38:27.163418 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:27.163374 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 24 21:38:29.314233 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:29.314206 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"
Apr 24 21:38:31.061910 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:31.061876 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"]
Apr 24 21:38:31.062310 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:31.062092 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" podUID="b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" containerName="switch-graph-914c2" containerID="cri-o://497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582" gracePeriod=30
Apr 24 21:38:34.313730 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:34.313684 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" podUID="b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" containerName="switch-graph-914c2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:38:37.164500 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:37.164470 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"
Apr 24 21:38:39.312989 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:39.312943 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" podUID="b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" containerName="switch-graph-914c2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:38:44.313494 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:44.313447 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" podUID="b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" containerName="switch-graph-914c2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:38:44.313981 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:44.313557 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"
Apr 24 21:38:49.313001 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:49.312960 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" podUID="b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" containerName="switch-graph-914c2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:38:54.312944 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:54.312899 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" podUID="b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" containerName="switch-graph-914c2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:38:59.312662 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:38:59.312615 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" podUID="b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" containerName="switch-graph-914c2" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:39:00.867202 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:00.867164 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t"]
Apr 24 21:39:00.872246 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:00.872221 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t"
Apr 24 21:39:00.874540 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:00.874517 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\""
Apr 24 21:39:00.874781 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:00.874765 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\""
Apr 24 21:39:00.878057 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:00.878034 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t"]
Apr 24 21:39:01.028638 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.028599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-proxy-tls\") pod \"model-chainer-64c4bcbb69-xgd8t\" (UID: \"21fcd48e-105c-4d7a-be45-51cfdea3fb4d\") " pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t"
Apr 24 21:39:01.028828 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.028713 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-openshift-service-ca-bundle\") pod \"model-chainer-64c4bcbb69-xgd8t\" (UID: \"21fcd48e-105c-4d7a-be45-51cfdea3fb4d\") " pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t"
Apr 24 21:39:01.122622 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:39:01.122590 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2864ba5_2503_4c9d_b4a6_da3a6e5c896c.slice/crio-497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2864ba5_2503_4c9d_b4a6_da3a6e5c896c.slice/crio-conmon-497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582.scope\": RecentStats: unable to find data in memory cache]"
Apr 24 21:39:01.122622 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:39:01.122611 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2864ba5_2503_4c9d_b4a6_da3a6e5c896c.slice/crio-c7e91167b22dedfc32054f4ac25b2078cfd77558559fbed6032ac06e978e1da4\": RecentStats: unable to find data in memory cache]"
Apr 24 21:39:01.122787 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:39:01.122659 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2864ba5_2503_4c9d_b4a6_da3a6e5c896c.slice/crio-497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2864ba5_2503_4c9d_b4a6_da3a6e5c896c.slice/crio-conmon-497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582.scope\": RecentStats: unable to find data in memory cache]"
Apr 24 21:39:01.129234 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.129205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-openshift-service-ca-bundle\") pod \"model-chainer-64c4bcbb69-xgd8t\" (UID: \"21fcd48e-105c-4d7a-be45-51cfdea3fb4d\") " pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t"
Apr 24 21:39:01.129399 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.129262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-proxy-tls\") pod \"model-chainer-64c4bcbb69-xgd8t\" (UID: \"21fcd48e-105c-4d7a-be45-51cfdea3fb4d\") " pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t"
Apr 24 21:39:01.129462 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:39:01.129414 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-serving-cert: secret "model-chainer-serving-cert" not found
Apr 24 21:39:01.129524 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:39:01.129471 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-proxy-tls podName:21fcd48e-105c-4d7a-be45-51cfdea3fb4d nodeName:}" failed. No retries permitted until 2026-04-24 21:39:01.629456127 +0000 UTC m=+748.976362451 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-proxy-tls") pod "model-chainer-64c4bcbb69-xgd8t" (UID: "21fcd48e-105c-4d7a-be45-51cfdea3fb4d") : secret "model-chainer-serving-cert" not found
Apr 24 21:39:01.129816 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.129797 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-openshift-service-ca-bundle\") pod \"model-chainer-64c4bcbb69-xgd8t\" (UID: \"21fcd48e-105c-4d7a-be45-51cfdea3fb4d\") " pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t"
Apr 24 21:39:01.242966 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.242944 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"
Apr 24 21:39:01.420323 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.420233 2574 generic.go:358] "Generic (PLEG): container finished" podID="b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" containerID="497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582" exitCode=0
Apr 24 21:39:01.420323 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.420293 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" Apr 24 21:39:01.420323 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.420319 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" event={"ID":"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c","Type":"ContainerDied","Data":"497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582"} Apr 24 21:39:01.420577 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.420365 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm" event={"ID":"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c","Type":"ContainerDied","Data":"c7e91167b22dedfc32054f4ac25b2078cfd77558559fbed6032ac06e978e1da4"} Apr 24 21:39:01.420577 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.420381 2574 scope.go:117] "RemoveContainer" containerID="497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582" Apr 24 21:39:01.427955 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.427936 2574 scope.go:117] "RemoveContainer" containerID="497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582" Apr 24 21:39:01.428219 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:39:01.428201 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582\": container with ID starting with 497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582 not found: ID does not exist" containerID="497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582" Apr 24 21:39:01.428264 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.428227 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582"} err="failed to get container status 
\"497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582\": rpc error: code = NotFound desc = could not find container \"497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582\": container with ID starting with 497a176ba0bdd6eef4ab00df8c4601f99a75517ae0fc3231e228f86b8812f582 not found: ID does not exist" Apr 24 21:39:01.432502 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.432487 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c-proxy-tls\") pod \"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c\" (UID: \"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c\") " Apr 24 21:39:01.432588 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.432576 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c-openshift-service-ca-bundle\") pod \"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c\" (UID: \"b2864ba5-2503-4c9d-b4a6-da3a6e5c896c\") " Apr 24 21:39:01.432875 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.432857 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" (UID: "b2864ba5-2503-4c9d-b4a6-da3a6e5c896c"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:01.434610 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.434588 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" (UID: "b2864ba5-2503-4c9d-b4a6-da3a6e5c896c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:01.533511 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.533473 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c-openshift-service-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:39:01.533511 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.533505 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c-proxy-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:39:01.633876 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.633842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-proxy-tls\") pod \"model-chainer-64c4bcbb69-xgd8t\" (UID: \"21fcd48e-105c-4d7a-be45-51cfdea3fb4d\") " pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" Apr 24 21:39:01.636249 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.636220 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-proxy-tls\") pod \"model-chainer-64c4bcbb69-xgd8t\" (UID: \"21fcd48e-105c-4d7a-be45-51cfdea3fb4d\") " pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" Apr 24 21:39:01.747703 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.747674 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"] Apr 24 21:39:01.756469 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.756442 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-914c2-55b54cc8f8-kv6sm"] Apr 24 21:39:01.783498 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.783472 2574 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" Apr 24 21:39:01.904480 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:01.904442 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t"] Apr 24 21:39:01.907793 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:39:01.907767 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21fcd48e_105c_4d7a_be45_51cfdea3fb4d.slice/crio-85c55d2975223bf62d435aa05e01bac778fff2a387776f12ebaa23405576e7f4 WatchSource:0}: Error finding container 85c55d2975223bf62d435aa05e01bac778fff2a387776f12ebaa23405576e7f4: Status 404 returned error can't find the container with id 85c55d2975223bf62d435aa05e01bac778fff2a387776f12ebaa23405576e7f4 Apr 24 21:39:02.425574 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:02.425541 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" event={"ID":"21fcd48e-105c-4d7a-be45-51cfdea3fb4d","Type":"ContainerStarted","Data":"8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee"} Apr 24 21:39:02.425574 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:02.425577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" event={"ID":"21fcd48e-105c-4d7a-be45-51cfdea3fb4d","Type":"ContainerStarted","Data":"85c55d2975223bf62d435aa05e01bac778fff2a387776f12ebaa23405576e7f4"} Apr 24 21:39:02.425827 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:02.425602 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" Apr 24 21:39:02.443484 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:02.443436 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" 
podStartSLOduration=2.4434207580000002 podStartE2EDuration="2.443420758s" podCreationTimestamp="2026-04-24 21:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:39:02.441682693 +0000 UTC m=+749.788589040" watchObservedRunningTime="2026-04-24 21:39:02.443420758 +0000 UTC m=+749.790327464" Apr 24 21:39:03.151816 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:03.151782 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" path="/var/lib/kubelet/pods/b2864ba5-2503-4c9d-b4a6-da3a6e5c896c/volumes" Apr 24 21:39:08.434393 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:08.434362 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" Apr 24 21:39:10.968858 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:10.968820 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t"] Apr 24 21:39:10.969225 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:10.969037 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" podUID="21fcd48e-105c-4d7a-be45-51cfdea3fb4d" containerName="model-chainer" containerID="cri-o://8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee" gracePeriod=30 Apr 24 21:39:11.234634 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:11.234551 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"] Apr 24 21:39:11.235052 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:11.234990 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kserve-container" 
containerID="cri-o://e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a" gracePeriod=30 Apr 24 21:39:11.235181 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:11.235046 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kube-rbac-proxy" containerID="cri-o://ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad" gracePeriod=30 Apr 24 21:39:11.453121 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:11.453087 2574 generic.go:358] "Generic (PLEG): container finished" podID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerID="ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad" exitCode=2 Apr 24 21:39:11.453278 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:11.453161 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" event={"ID":"a1ab7a10-0005-4d6c-a55d-b1d39e63c048","Type":"ContainerDied","Data":"ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad"} Apr 24 21:39:12.159263 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:12.159221 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 24 21:39:13.432131 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:13.432087 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" podUID="21fcd48e-105c-4d7a-be45-51cfdea3fb4d" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:14.966369 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:14.966321 2574 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" Apr 24 21:39:15.029536 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.029497 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-kserve-provision-location\") pod \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " Apr 24 21:39:15.029711 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.029546 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgcj7\" (UniqueName: \"kubernetes.io/projected/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-kube-api-access-rgcj7\") pod \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " Apr 24 21:39:15.029711 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.029587 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " Apr 24 21:39:15.029711 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.029618 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-proxy-tls\") pod \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\" (UID: \"a1ab7a10-0005-4d6c-a55d-b1d39e63c048\") " Apr 24 21:39:15.029880 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.029825 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "a1ab7a10-0005-4d6c-a55d-b1d39e63c048" (UID: "a1ab7a10-0005-4d6c-a55d-b1d39e63c048"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:39:15.029951 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.029929 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "a1ab7a10-0005-4d6c-a55d-b1d39e63c048" (UID: "a1ab7a10-0005-4d6c-a55d-b1d39e63c048"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:15.031655 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.031635 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-kube-api-access-rgcj7" (OuterVolumeSpecName: "kube-api-access-rgcj7") pod "a1ab7a10-0005-4d6c-a55d-b1d39e63c048" (UID: "a1ab7a10-0005-4d6c-a55d-b1d39e63c048"). InnerVolumeSpecName "kube-api-access-rgcj7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:39:15.031845 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.031820 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a1ab7a10-0005-4d6c-a55d-b1d39e63c048" (UID: "a1ab7a10-0005-4d6c-a55d-b1d39e63c048"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:15.130658 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.130543 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:39:15.130658 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.130591 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-proxy-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:39:15.130658 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.130608 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-kserve-provision-location\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:39:15.130658 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.130620 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rgcj7\" (UniqueName: \"kubernetes.io/projected/a1ab7a10-0005-4d6c-a55d-b1d39e63c048-kube-api-access-rgcj7\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:39:15.466223 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.466186 2574 generic.go:358] "Generic (PLEG): container finished" podID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerID="e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a" exitCode=0 Apr 24 21:39:15.466440 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.466265 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" Apr 24 21:39:15.466440 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.466268 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" event={"ID":"a1ab7a10-0005-4d6c-a55d-b1d39e63c048","Type":"ContainerDied","Data":"e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a"} Apr 24 21:39:15.466440 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.466310 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6" event={"ID":"a1ab7a10-0005-4d6c-a55d-b1d39e63c048","Type":"ContainerDied","Data":"2f0366cb7bfce5239d821b28944efd376377745568fa250af247a9c47fa2feda"} Apr 24 21:39:15.466440 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.466343 2574 scope.go:117] "RemoveContainer" containerID="ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad" Apr 24 21:39:15.474340 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.474299 2574 scope.go:117] "RemoveContainer" containerID="e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a" Apr 24 21:39:15.482317 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.482293 2574 scope.go:117] "RemoveContainer" containerID="22ae9ac3152f0e18a2e65289108dcea1d4e4a1bad85c776c0e5286b7770924f6" Apr 24 21:39:15.483950 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.483926 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"] Apr 24 21:39:15.487059 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.487037 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-6z8g6"] Apr 24 21:39:15.493024 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.492992 2574 scope.go:117] "RemoveContainer" 
containerID="ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad" Apr 24 21:39:15.493543 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:39:15.493516 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad\": container with ID starting with ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad not found: ID does not exist" containerID="ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad" Apr 24 21:39:15.493612 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.493553 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad"} err="failed to get container status \"ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad\": rpc error: code = NotFound desc = could not find container \"ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad\": container with ID starting with ea98d4eda03d71a0fb782652dd62550d71b8b14f1d87a3ab04eaca38309dccad not found: ID does not exist" Apr 24 21:39:15.493612 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.493572 2574 scope.go:117] "RemoveContainer" containerID="e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a" Apr 24 21:39:15.493859 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:39:15.493840 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a\": container with ID starting with e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a not found: ID does not exist" containerID="e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a" Apr 24 21:39:15.493911 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.493864 2574 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a"} err="failed to get container status \"e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a\": rpc error: code = NotFound desc = could not find container \"e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a\": container with ID starting with e5ad4c039bed683ef857039d8472f6ecbd8401c388d8a13c2233351f9b70357a not found: ID does not exist" Apr 24 21:39:15.493911 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.493882 2574 scope.go:117] "RemoveContainer" containerID="22ae9ac3152f0e18a2e65289108dcea1d4e4a1bad85c776c0e5286b7770924f6" Apr 24 21:39:15.494130 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:39:15.494113 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22ae9ac3152f0e18a2e65289108dcea1d4e4a1bad85c776c0e5286b7770924f6\": container with ID starting with 22ae9ac3152f0e18a2e65289108dcea1d4e4a1bad85c776c0e5286b7770924f6 not found: ID does not exist" containerID="22ae9ac3152f0e18a2e65289108dcea1d4e4a1bad85c776c0e5286b7770924f6" Apr 24 21:39:15.494169 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:15.494135 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ae9ac3152f0e18a2e65289108dcea1d4e4a1bad85c776c0e5286b7770924f6"} err="failed to get container status \"22ae9ac3152f0e18a2e65289108dcea1d4e4a1bad85c776c0e5286b7770924f6\": rpc error: code = NotFound desc = could not find container \"22ae9ac3152f0e18a2e65289108dcea1d4e4a1bad85c776c0e5286b7770924f6\": container with ID starting with 22ae9ac3152f0e18a2e65289108dcea1d4e4a1bad85c776c0e5286b7770924f6 not found: ID does not exist" Apr 24 21:39:17.147170 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:17.147132 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" 
path="/var/lib/kubelet/pods/a1ab7a10-0005-4d6c-a55d-b1d39e63c048/volumes" Apr 24 21:39:18.432875 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:18.432829 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" podUID="21fcd48e-105c-4d7a-be45-51cfdea3fb4d" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:23.433007 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:23.432964 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" podUID="21fcd48e-105c-4d7a-be45-51cfdea3fb4d" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:23.433428 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:23.433091 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" Apr 24 21:39:28.432442 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:28.432397 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" podUID="21fcd48e-105c-4d7a-be45-51cfdea3fb4d" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:31.309058 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.308991 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf"] Apr 24 21:39:31.309614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.309518 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kserve-container" Apr 24 21:39:31.309614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.309539 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kserve-container" Apr 24 21:39:31.309614 ip-10-0-133-48 
kubenswrapper[2574]: I0424 21:39:31.309554 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="storage-initializer" Apr 24 21:39:31.309614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.309562 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="storage-initializer" Apr 24 21:39:31.309614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.309589 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kube-rbac-proxy" Apr 24 21:39:31.309614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.309597 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kube-rbac-proxy" Apr 24 21:39:31.309614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.309605 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" containerName="switch-graph-914c2" Apr 24 21:39:31.309614 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.309613 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" containerName="switch-graph-914c2" Apr 24 21:39:31.309989 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.309694 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kube-rbac-proxy" Apr 24 21:39:31.309989 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.309706 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2864ba5-2503-4c9d-b4a6-da3a6e5c896c" containerName="switch-graph-914c2" Apr 24 21:39:31.309989 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.309717 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1ab7a10-0005-4d6c-a55d-b1d39e63c048" containerName="kserve-container" Apr 24 
21:39:31.314298 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.314274 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:39:31.316220 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.316196 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf"] Apr 24 21:39:31.317580 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.317557 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-c12fa-serving-cert\"" Apr 24 21:39:31.317714 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.317585 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-c12fa-kube-rbac-proxy-sar-config\"" Apr 24 21:39:31.350768 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.350722 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4b83589-a966-4c69-b88a-88989d6eed00-openshift-service-ca-bundle\") pod \"switch-graph-c12fa-594b845bff-sdhrf\" (UID: \"b4b83589-a966-4c69-b88a-88989d6eed00\") " pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:39:31.350936 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.350866 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4b83589-a966-4c69-b88a-88989d6eed00-proxy-tls\") pod \"switch-graph-c12fa-594b845bff-sdhrf\" (UID: \"b4b83589-a966-4c69-b88a-88989d6eed00\") " pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:39:31.451833 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.451792 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b4b83589-a966-4c69-b88a-88989d6eed00-openshift-service-ca-bundle\") pod \"switch-graph-c12fa-594b845bff-sdhrf\" (UID: \"b4b83589-a966-4c69-b88a-88989d6eed00\") " pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:39:31.452016 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.451876 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4b83589-a966-4c69-b88a-88989d6eed00-proxy-tls\") pod \"switch-graph-c12fa-594b845bff-sdhrf\" (UID: \"b4b83589-a966-4c69-b88a-88989d6eed00\") " pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:39:31.452537 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.452509 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4b83589-a966-4c69-b88a-88989d6eed00-openshift-service-ca-bundle\") pod \"switch-graph-c12fa-594b845bff-sdhrf\" (UID: \"b4b83589-a966-4c69-b88a-88989d6eed00\") " pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:39:31.454342 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.454306 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4b83589-a966-4c69-b88a-88989d6eed00-proxy-tls\") pod \"switch-graph-c12fa-594b845bff-sdhrf\" (UID: \"b4b83589-a966-4c69-b88a-88989d6eed00\") " pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:39:31.625668 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.625577 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:39:31.760764 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:31.760727 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf"] Apr 24 21:39:31.765297 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:39:31.765267 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4b83589_a966_4c69_b88a_88989d6eed00.slice/crio-70205d59a36c6edf93144e85e69e43036602618f6ed2c93b90680492ec0d3f25 WatchSource:0}: Error finding container 70205d59a36c6edf93144e85e69e43036602618f6ed2c93b90680492ec0d3f25: Status 404 returned error can't find the container with id 70205d59a36c6edf93144e85e69e43036602618f6ed2c93b90680492ec0d3f25 Apr 24 21:39:32.517829 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:32.517789 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" event={"ID":"b4b83589-a966-4c69-b88a-88989d6eed00","Type":"ContainerStarted","Data":"a1157882ddfa2d7478c28d2ff958b4d73a5e7f6a2797a11f6be5dbf5af7f53ce"} Apr 24 21:39:32.517829 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:32.517827 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" event={"ID":"b4b83589-a966-4c69-b88a-88989d6eed00","Type":"ContainerStarted","Data":"70205d59a36c6edf93144e85e69e43036602618f6ed2c93b90680492ec0d3f25"} Apr 24 21:39:32.518401 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:32.517946 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:39:32.533444 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:32.533387 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" 
podStartSLOduration=1.533370314 podStartE2EDuration="1.533370314s" podCreationTimestamp="2026-04-24 21:39:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:39:32.533302567 +0000 UTC m=+779.880208913" watchObservedRunningTime="2026-04-24 21:39:32.533370314 +0000 UTC m=+779.880276661" Apr 24 21:39:33.432439 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:33.432348 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" podUID="21fcd48e-105c-4d7a-be45-51cfdea3fb4d" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:38.432595 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:38.432549 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" podUID="21fcd48e-105c-4d7a-be45-51cfdea3fb4d" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:39:38.527298 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:38.527261 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:39:41.011272 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:39:41.011237 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21fcd48e_105c_4d7a_be45_51cfdea3fb4d.slice/crio-conmon-8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee.scope\": RecentStats: unable to find data in memory cache]" Apr 24 21:39:41.110569 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.110542 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" Apr 24 21:39:41.233017 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.232917 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-proxy-tls\") pod \"21fcd48e-105c-4d7a-be45-51cfdea3fb4d\" (UID: \"21fcd48e-105c-4d7a-be45-51cfdea3fb4d\") " Apr 24 21:39:41.233017 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.232960 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-openshift-service-ca-bundle\") pod \"21fcd48e-105c-4d7a-be45-51cfdea3fb4d\" (UID: \"21fcd48e-105c-4d7a-be45-51cfdea3fb4d\") " Apr 24 21:39:41.233470 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.233443 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "21fcd48e-105c-4d7a-be45-51cfdea3fb4d" (UID: "21fcd48e-105c-4d7a-be45-51cfdea3fb4d"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:41.235089 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.235068 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "21fcd48e-105c-4d7a-be45-51cfdea3fb4d" (UID: "21fcd48e-105c-4d7a-be45-51cfdea3fb4d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:41.334354 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.334299 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-proxy-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:39:41.334528 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.334373 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21fcd48e-105c-4d7a-be45-51cfdea3fb4d-openshift-service-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:39:41.547414 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.547291 2574 generic.go:358] "Generic (PLEG): container finished" podID="21fcd48e-105c-4d7a-be45-51cfdea3fb4d" containerID="8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee" exitCode=0 Apr 24 21:39:41.547414 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.547368 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" Apr 24 21:39:41.547615 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.547367 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" event={"ID":"21fcd48e-105c-4d7a-be45-51cfdea3fb4d","Type":"ContainerDied","Data":"8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee"} Apr 24 21:39:41.547615 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.547463 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t" event={"ID":"21fcd48e-105c-4d7a-be45-51cfdea3fb4d","Type":"ContainerDied","Data":"85c55d2975223bf62d435aa05e01bac778fff2a387776f12ebaa23405576e7f4"} Apr 24 21:39:41.547615 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.547478 2574 scope.go:117] "RemoveContainer" containerID="8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee" Apr 24 21:39:41.556187 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.556167 2574 scope.go:117] "RemoveContainer" containerID="8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee" Apr 24 21:39:41.556528 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:39:41.556504 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee\": container with ID starting with 8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee not found: ID does not exist" containerID="8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee" Apr 24 21:39:41.556631 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.556537 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee"} err="failed to get container status \"8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee\": rpc 
error: code = NotFound desc = could not find container \"8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee\": container with ID starting with 8fbe0c4e0fdca0335c95d30a2148cfe228c09811d657d861ff09f142a38e11ee not found: ID does not exist" Apr 24 21:39:41.570790 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.570757 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t"] Apr 24 21:39:41.574703 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:41.574676 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-64c4bcbb69-xgd8t"] Apr 24 21:39:43.153554 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:39:43.153508 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21fcd48e-105c-4d7a-be45-51cfdea3fb4d" path="/var/lib/kubelet/pods/21fcd48e-105c-4d7a-be45-51cfdea3fb4d/volumes" Apr 24 21:40:11.216595 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.216558 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth"] Apr 24 21:40:11.217078 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.216898 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21fcd48e-105c-4d7a-be45-51cfdea3fb4d" containerName="model-chainer" Apr 24 21:40:11.217078 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.216910 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="21fcd48e-105c-4d7a-be45-51cfdea3fb4d" containerName="model-chainer" Apr 24 21:40:11.217078 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.216973 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="21fcd48e-105c-4d7a-be45-51cfdea3fb4d" containerName="model-chainer" Apr 24 21:40:11.219925 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.219903 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:40:11.222169 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.222142 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-7fc99-serving-cert\"" Apr 24 21:40:11.222310 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.222189 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-7fc99-kube-rbac-proxy-sar-config\"" Apr 24 21:40:11.235075 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.235045 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth"] Apr 24 21:40:11.399132 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.399092 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61786274-3c1c-4909-85cc-dc763bba1fe1-proxy-tls\") pod \"sequence-graph-7fc99-59cc895d65-jhvth\" (UID: \"61786274-3c1c-4909-85cc-dc763bba1fe1\") " pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:40:11.399363 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.399152 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61786274-3c1c-4909-85cc-dc763bba1fe1-openshift-service-ca-bundle\") pod \"sequence-graph-7fc99-59cc895d65-jhvth\" (UID: \"61786274-3c1c-4909-85cc-dc763bba1fe1\") " pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:40:11.500060 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.499968 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61786274-3c1c-4909-85cc-dc763bba1fe1-proxy-tls\") pod \"sequence-graph-7fc99-59cc895d65-jhvth\" (UID: 
\"61786274-3c1c-4909-85cc-dc763bba1fe1\") " pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:40:11.500060 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.500020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61786274-3c1c-4909-85cc-dc763bba1fe1-openshift-service-ca-bundle\") pod \"sequence-graph-7fc99-59cc895d65-jhvth\" (UID: \"61786274-3c1c-4909-85cc-dc763bba1fe1\") " pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:40:11.500258 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:40:11.500125 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-7fc99-serving-cert: secret "sequence-graph-7fc99-serving-cert" not found Apr 24 21:40:11.500258 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:40:11.500197 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61786274-3c1c-4909-85cc-dc763bba1fe1-proxy-tls podName:61786274-3c1c-4909-85cc-dc763bba1fe1 nodeName:}" failed. No retries permitted until 2026-04-24 21:40:12.000176845 +0000 UTC m=+819.347083170 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/61786274-3c1c-4909-85cc-dc763bba1fe1-proxy-tls") pod "sequence-graph-7fc99-59cc895d65-jhvth" (UID: "61786274-3c1c-4909-85cc-dc763bba1fe1") : secret "sequence-graph-7fc99-serving-cert" not found Apr 24 21:40:11.500644 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:11.500627 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61786274-3c1c-4909-85cc-dc763bba1fe1-openshift-service-ca-bundle\") pod \"sequence-graph-7fc99-59cc895d65-jhvth\" (UID: \"61786274-3c1c-4909-85cc-dc763bba1fe1\") " pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:40:12.004581 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:12.004545 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61786274-3c1c-4909-85cc-dc763bba1fe1-proxy-tls\") pod \"sequence-graph-7fc99-59cc895d65-jhvth\" (UID: \"61786274-3c1c-4909-85cc-dc763bba1fe1\") " pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:40:12.006960 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:12.006928 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61786274-3c1c-4909-85cc-dc763bba1fe1-proxy-tls\") pod \"sequence-graph-7fc99-59cc895d65-jhvth\" (UID: \"61786274-3c1c-4909-85cc-dc763bba1fe1\") " pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:40:12.130390 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:12.130347 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:40:12.252477 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:12.252442 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth"] Apr 24 21:40:12.256164 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:40:12.256091 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61786274_3c1c_4909_85cc_dc763bba1fe1.slice/crio-3580cf34ba2131482509dd2cd81cb1ed11fd6cfc9867ca54214ff761a83c0766 WatchSource:0}: Error finding container 3580cf34ba2131482509dd2cd81cb1ed11fd6cfc9867ca54214ff761a83c0766: Status 404 returned error can't find the container with id 3580cf34ba2131482509dd2cd81cb1ed11fd6cfc9867ca54214ff761a83c0766 Apr 24 21:40:12.643167 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:12.643064 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" event={"ID":"61786274-3c1c-4909-85cc-dc763bba1fe1","Type":"ContainerStarted","Data":"02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc"} Apr 24 21:40:12.643167 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:12.643103 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" event={"ID":"61786274-3c1c-4909-85cc-dc763bba1fe1","Type":"ContainerStarted","Data":"3580cf34ba2131482509dd2cd81cb1ed11fd6cfc9867ca54214ff761a83c0766"} Apr 24 21:40:12.643167 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:12.643126 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:40:12.660289 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:12.660234 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" 
podStartSLOduration=1.660219706 podStartE2EDuration="1.660219706s" podCreationTimestamp="2026-04-24 21:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:40:12.659227841 +0000 UTC m=+820.006134189" watchObservedRunningTime="2026-04-24 21:40:12.660219706 +0000 UTC m=+820.007126051" Apr 24 21:40:18.652435 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:40:18.652404 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:41:33.137785 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:41:33.137757 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:41:33.140140 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:41:33.140116 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:41:33.143320 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:41:33.143293 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:41:33.146261 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:41:33.146235 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:46:33.159835 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:46:33.159796 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:46:33.163277 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:46:33.163256 2574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:46:33.167900 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:46:33.167875 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:46:33.171574 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:46:33.171555 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:47:45.978092 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:47:45.978000 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf"] Apr 24 21:47:45.978661 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:47:45.978283 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" podUID="b4b83589-a966-4c69-b88a-88989d6eed00" containerName="switch-graph-c12fa" containerID="cri-o://a1157882ddfa2d7478c28d2ff958b4d73a5e7f6a2797a11f6be5dbf5af7f53ce" gracePeriod=30 Apr 24 21:47:48.525903 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:47:48.525861 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" podUID="b4b83589-a966-4c69-b88a-88989d6eed00" containerName="switch-graph-c12fa" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:47:53.526194 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:47:53.526146 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" podUID="b4b83589-a966-4c69-b88a-88989d6eed00" containerName="switch-graph-c12fa" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:47:58.525462 
ip-10-0-133-48 kubenswrapper[2574]: I0424 21:47:58.525423 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" podUID="b4b83589-a966-4c69-b88a-88989d6eed00" containerName="switch-graph-c12fa" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:47:58.525854 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:47:58.525530 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:48:03.525545 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:03.525505 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" podUID="b4b83589-a966-4c69-b88a-88989d6eed00" containerName="switch-graph-c12fa" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:48:08.526164 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:08.526122 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" podUID="b4b83589-a966-4c69-b88a-88989d6eed00" containerName="switch-graph-c12fa" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:48:13.526223 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:13.526183 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" podUID="b4b83589-a966-4c69-b88a-88989d6eed00" containerName="switch-graph-c12fa" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:48:16.083671 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:16.083635 2574 generic.go:358] "Generic (PLEG): container finished" podID="b4b83589-a966-4c69-b88a-88989d6eed00" containerID="a1157882ddfa2d7478c28d2ff958b4d73a5e7f6a2797a11f6be5dbf5af7f53ce" exitCode=0 Apr 24 21:48:16.084063 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:16.083722 2574 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" event={"ID":"b4b83589-a966-4c69-b88a-88989d6eed00","Type":"ContainerDied","Data":"a1157882ddfa2d7478c28d2ff958b4d73a5e7f6a2797a11f6be5dbf5af7f53ce"} Apr 24 21:48:16.133497 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:16.133473 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:48:16.217520 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:16.217481 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4b83589-a966-4c69-b88a-88989d6eed00-proxy-tls\") pod \"b4b83589-a966-4c69-b88a-88989d6eed00\" (UID: \"b4b83589-a966-4c69-b88a-88989d6eed00\") " Apr 24 21:48:16.217724 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:16.217584 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4b83589-a966-4c69-b88a-88989d6eed00-openshift-service-ca-bundle\") pod \"b4b83589-a966-4c69-b88a-88989d6eed00\" (UID: \"b4b83589-a966-4c69-b88a-88989d6eed00\") " Apr 24 21:48:16.217965 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:16.217944 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4b83589-a966-4c69-b88a-88989d6eed00-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b4b83589-a966-4c69-b88a-88989d6eed00" (UID: "b4b83589-a966-4c69-b88a-88989d6eed00"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:48:16.219715 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:16.219690 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4b83589-a966-4c69-b88a-88989d6eed00-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b4b83589-a966-4c69-b88a-88989d6eed00" (UID: "b4b83589-a966-4c69-b88a-88989d6eed00"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:48:16.319219 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:16.319099 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4b83589-a966-4c69-b88a-88989d6eed00-proxy-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:48:16.319219 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:16.319152 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4b83589-a966-4c69-b88a-88989d6eed00-openshift-service-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:48:17.088706 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:17.088669 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" event={"ID":"b4b83589-a966-4c69-b88a-88989d6eed00","Type":"ContainerDied","Data":"70205d59a36c6edf93144e85e69e43036602618f6ed2c93b90680492ec0d3f25"} Apr 24 21:48:17.089145 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:17.088723 2574 scope.go:117] "RemoveContainer" containerID="a1157882ddfa2d7478c28d2ff958b4d73a5e7f6a2797a11f6be5dbf5af7f53ce" Apr 24 21:48:17.089145 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:17.088729 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf" Apr 24 21:48:17.116103 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:17.116065 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf"] Apr 24 21:48:17.117388 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:17.117361 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-c12fa-594b845bff-sdhrf"] Apr 24 21:48:17.147624 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:17.147593 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b83589-a966-4c69-b88a-88989d6eed00" path="/var/lib/kubelet/pods/b4b83589-a966-4c69-b88a-88989d6eed00/volumes" Apr 24 21:48:25.845572 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:25.845534 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth"] Apr 24 21:48:25.846080 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:25.845766 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" podUID="61786274-3c1c-4909-85cc-dc763bba1fe1" containerName="sequence-graph-7fc99" containerID="cri-o://02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc" gracePeriod=30 Apr 24 21:48:28.650478 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:28.650425 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" podUID="61786274-3c1c-4909-85cc-dc763bba1fe1" containerName="sequence-graph-7fc99" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:48:33.650063 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:33.649965 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" podUID="61786274-3c1c-4909-85cc-dc763bba1fe1" containerName="sequence-graph-7fc99" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:48:38.650665 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:38.650619 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" podUID="61786274-3c1c-4909-85cc-dc763bba1fe1" containerName="sequence-graph-7fc99" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:48:38.651058 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:38.650788 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:48:43.649778 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:43.649683 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" podUID="61786274-3c1c-4909-85cc-dc763bba1fe1" containerName="sequence-graph-7fc99" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:48:46.196989 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.196954 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl"] Apr 24 21:48:46.197424 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.197293 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4b83589-a966-4c69-b88a-88989d6eed00" containerName="switch-graph-c12fa" Apr 24 21:48:46.197424 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.197304 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b83589-a966-4c69-b88a-88989d6eed00" containerName="switch-graph-c12fa" Apr 24 21:48:46.197424 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.197373 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4b83589-a966-4c69-b88a-88989d6eed00" containerName="switch-graph-c12fa" Apr 24 21:48:46.201577 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.201557 2574 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:48:46.203906 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.203876 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-2525b-kube-rbac-proxy-sar-config\"" Apr 24 21:48:46.203906 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.203876 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-2525b-serving-cert\"" Apr 24 21:48:46.217512 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.217482 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl"] Apr 24 21:48:46.270095 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.270058 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aadad426-f0be-479b-9a3b-c0b06acec4c4-proxy-tls\") pod \"ensemble-graph-2525b-8997d599-kctgl\" (UID: \"aadad426-f0be-479b-9a3b-c0b06acec4c4\") " pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:48:46.270095 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.270101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aadad426-f0be-479b-9a3b-c0b06acec4c4-openshift-service-ca-bundle\") pod \"ensemble-graph-2525b-8997d599-kctgl\" (UID: \"aadad426-f0be-479b-9a3b-c0b06acec4c4\") " pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:48:46.371418 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.371378 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aadad426-f0be-479b-9a3b-c0b06acec4c4-proxy-tls\") pod \"ensemble-graph-2525b-8997d599-kctgl\" (UID: 
\"aadad426-f0be-479b-9a3b-c0b06acec4c4\") " pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:48:46.371590 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.371426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aadad426-f0be-479b-9a3b-c0b06acec4c4-openshift-service-ca-bundle\") pod \"ensemble-graph-2525b-8997d599-kctgl\" (UID: \"aadad426-f0be-479b-9a3b-c0b06acec4c4\") " pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:48:46.372156 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.372122 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aadad426-f0be-479b-9a3b-c0b06acec4c4-openshift-service-ca-bundle\") pod \"ensemble-graph-2525b-8997d599-kctgl\" (UID: \"aadad426-f0be-479b-9a3b-c0b06acec4c4\") " pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:48:46.373833 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.373810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aadad426-f0be-479b-9a3b-c0b06acec4c4-proxy-tls\") pod \"ensemble-graph-2525b-8997d599-kctgl\" (UID: \"aadad426-f0be-479b-9a3b-c0b06acec4c4\") " pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:48:46.512205 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.512112 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:48:46.636004 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.635970 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl"] Apr 24 21:48:46.639113 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:48:46.639080 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaadad426_f0be_479b_9a3b_c0b06acec4c4.slice/crio-1dd5b7b03dba41e59d6508cf9c7f889926ad9a9e47326721657d234c9e572db9 WatchSource:0}: Error finding container 1dd5b7b03dba41e59d6508cf9c7f889926ad9a9e47326721657d234c9e572db9: Status 404 returned error can't find the container with id 1dd5b7b03dba41e59d6508cf9c7f889926ad9a9e47326721657d234c9e572db9 Apr 24 21:48:46.641281 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:46.641263 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:48:47.192849 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:47.192814 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" event={"ID":"aadad426-f0be-479b-9a3b-c0b06acec4c4","Type":"ContainerStarted","Data":"11d5827fc293262171bd6b71d7c763772efc9d79b2a1a5be62269d6dde44f45e"} Apr 24 21:48:47.192849 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:47.192851 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" event={"ID":"aadad426-f0be-479b-9a3b-c0b06acec4c4","Type":"ContainerStarted","Data":"1dd5b7b03dba41e59d6508cf9c7f889926ad9a9e47326721657d234c9e572db9"} Apr 24 21:48:47.193087 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:47.192878 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:48:47.212360 ip-10-0-133-48 kubenswrapper[2574]: 
I0424 21:48:47.212289 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" podStartSLOduration=1.212272781 podStartE2EDuration="1.212272781s" podCreationTimestamp="2026-04-24 21:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:48:47.210405708 +0000 UTC m=+1334.557312055" watchObservedRunningTime="2026-04-24 21:48:47.212272781 +0000 UTC m=+1334.559179127" Apr 24 21:48:48.649745 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:48.649701 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" podUID="61786274-3c1c-4909-85cc-dc763bba1fe1" containerName="sequence-graph-7fc99" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:48:53.202117 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:53.202080 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:48:53.649637 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:53.649536 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" podUID="61786274-3c1c-4909-85cc-dc763bba1fe1" containerName="sequence-graph-7fc99" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:48:55.983150 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:55.983121 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:48:56.054625 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.054584 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61786274-3c1c-4909-85cc-dc763bba1fe1-proxy-tls\") pod \"61786274-3c1c-4909-85cc-dc763bba1fe1\" (UID: \"61786274-3c1c-4909-85cc-dc763bba1fe1\") " Apr 24 21:48:56.054625 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.054627 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61786274-3c1c-4909-85cc-dc763bba1fe1-openshift-service-ca-bundle\") pod \"61786274-3c1c-4909-85cc-dc763bba1fe1\" (UID: \"61786274-3c1c-4909-85cc-dc763bba1fe1\") " Apr 24 21:48:56.055057 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.055021 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61786274-3c1c-4909-85cc-dc763bba1fe1-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "61786274-3c1c-4909-85cc-dc763bba1fe1" (UID: "61786274-3c1c-4909-85cc-dc763bba1fe1"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:48:56.056807 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.056777 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61786274-3c1c-4909-85cc-dc763bba1fe1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "61786274-3c1c-4909-85cc-dc763bba1fe1" (UID: "61786274-3c1c-4909-85cc-dc763bba1fe1"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:48:56.156137 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.156040 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61786274-3c1c-4909-85cc-dc763bba1fe1-proxy-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:48:56.156137 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.156078 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61786274-3c1c-4909-85cc-dc763bba1fe1-openshift-service-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:48:56.219363 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.219304 2574 generic.go:358] "Generic (PLEG): container finished" podID="61786274-3c1c-4909-85cc-dc763bba1fe1" containerID="02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc" exitCode=0 Apr 24 21:48:56.219563 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.219370 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" event={"ID":"61786274-3c1c-4909-85cc-dc763bba1fe1","Type":"ContainerDied","Data":"02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc"} Apr 24 21:48:56.219563 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.219413 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" event={"ID":"61786274-3c1c-4909-85cc-dc763bba1fe1","Type":"ContainerDied","Data":"3580cf34ba2131482509dd2cd81cb1ed11fd6cfc9867ca54214ff761a83c0766"} Apr 24 21:48:56.219563 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.219423 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth" Apr 24 21:48:56.219563 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.219428 2574 scope.go:117] "RemoveContainer" containerID="02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc" Apr 24 21:48:56.229428 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.229304 2574 scope.go:117] "RemoveContainer" containerID="02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc" Apr 24 21:48:56.230048 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:48:56.230025 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc\": container with ID starting with 02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc not found: ID does not exist" containerID="02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc" Apr 24 21:48:56.230155 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.230056 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc"} err="failed to get container status \"02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc\": rpc error: code = NotFound desc = could not find container \"02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc\": container with ID starting with 02507a67e6394d37b0bc40ab94241addefa7a1582a70c50725951c257f1190fc not found: ID does not exist" Apr 24 21:48:56.242542 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.242507 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth"] Apr 24 21:48:56.244938 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.244906 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-7fc99-59cc895d65-jhvth"] Apr 24 21:48:56.265806 
ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.265771 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl"] Apr 24 21:48:56.266048 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:56.266001 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" podUID="aadad426-f0be-479b-9a3b-c0b06acec4c4" containerName="ensemble-graph-2525b" containerID="cri-o://11d5827fc293262171bd6b71d7c763772efc9d79b2a1a5be62269d6dde44f45e" gracePeriod=30 Apr 24 21:48:57.146892 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:57.146856 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61786274-3c1c-4909-85cc-dc763bba1fe1" path="/var/lib/kubelet/pods/61786274-3c1c-4909-85cc-dc763bba1fe1/volumes" Apr 24 21:48:58.199972 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:48:58.199934 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" podUID="aadad426-f0be-479b-9a3b-c0b06acec4c4" containerName="ensemble-graph-2525b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:49:03.200545 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:03.200502 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" podUID="aadad426-f0be-479b-9a3b-c0b06acec4c4" containerName="ensemble-graph-2525b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:49:08.200216 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:08.200174 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" podUID="aadad426-f0be-479b-9a3b-c0b06acec4c4" containerName="ensemble-graph-2525b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:49:08.200673 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:08.200285 2574 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:49:13.200455 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:13.200411 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" podUID="aadad426-f0be-479b-9a3b-c0b06acec4c4" containerName="ensemble-graph-2525b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:49:18.200459 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:18.200405 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" podUID="aadad426-f0be-479b-9a3b-c0b06acec4c4" containerName="ensemble-graph-2525b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:49:23.200472 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:23.200426 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" podUID="aadad426-f0be-479b-9a3b-c0b06acec4c4" containerName="ensemble-graph-2525b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:49:26.029957 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.029919 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7"] Apr 24 21:49:26.030448 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.030226 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="61786274-3c1c-4909-85cc-dc763bba1fe1" containerName="sequence-graph-7fc99" Apr 24 21:49:26.030448 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.030238 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="61786274-3c1c-4909-85cc-dc763bba1fe1" containerName="sequence-graph-7fc99" Apr 24 21:49:26.030448 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.030305 2574 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="61786274-3c1c-4909-85cc-dc763bba1fe1" containerName="sequence-graph-7fc99" Apr 24 21:49:26.033198 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.033171 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:49:26.035473 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.035449 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-b9878-kube-rbac-proxy-sar-config\"" Apr 24 21:49:26.035609 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.035505 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-b9878-serving-cert\"" Apr 24 21:49:26.044322 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.044297 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7"] Apr 24 21:49:26.106178 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.106131 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95d06567-f7c3-4722-a71e-23e7c86ee03f-proxy-tls\") pod \"sequence-graph-b9878-79b8776d89-hngl7\" (UID: \"95d06567-f7c3-4722-a71e-23e7c86ee03f\") " pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:49:26.106386 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.106193 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d06567-f7c3-4722-a71e-23e7c86ee03f-openshift-service-ca-bundle\") pod \"sequence-graph-b9878-79b8776d89-hngl7\" (UID: \"95d06567-f7c3-4722-a71e-23e7c86ee03f\") " pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:49:26.207251 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.207216 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d06567-f7c3-4722-a71e-23e7c86ee03f-openshift-service-ca-bundle\") pod \"sequence-graph-b9878-79b8776d89-hngl7\" (UID: \"95d06567-f7c3-4722-a71e-23e7c86ee03f\") " pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:49:26.207450 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.207280 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95d06567-f7c3-4722-a71e-23e7c86ee03f-proxy-tls\") pod \"sequence-graph-b9878-79b8776d89-hngl7\" (UID: \"95d06567-f7c3-4722-a71e-23e7c86ee03f\") " pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:49:26.207914 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.207890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d06567-f7c3-4722-a71e-23e7c86ee03f-openshift-service-ca-bundle\") pod \"sequence-graph-b9878-79b8776d89-hngl7\" (UID: \"95d06567-f7c3-4722-a71e-23e7c86ee03f\") " pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:49:26.209861 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.209835 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95d06567-f7c3-4722-a71e-23e7c86ee03f-proxy-tls\") pod \"sequence-graph-b9878-79b8776d89-hngl7\" (UID: \"95d06567-f7c3-4722-a71e-23e7c86ee03f\") " pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:49:26.309448 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.309409 2574 generic.go:358] "Generic (PLEG): container finished" podID="aadad426-f0be-479b-9a3b-c0b06acec4c4" containerID="11d5827fc293262171bd6b71d7c763772efc9d79b2a1a5be62269d6dde44f45e" exitCode=0 Apr 24 21:49:26.309617 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.309482 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" event={"ID":"aadad426-f0be-479b-9a3b-c0b06acec4c4","Type":"ContainerDied","Data":"11d5827fc293262171bd6b71d7c763772efc9d79b2a1a5be62269d6dde44f45e"} Apr 24 21:49:26.344083 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.344045 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:49:26.406806 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.406783 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:49:26.481442 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.481405 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7"] Apr 24 21:49:26.484398 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:49:26.484370 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95d06567_f7c3_4722_a71e_23e7c86ee03f.slice/crio-04254092e9910ce0bc4c78e8041498d5f45451155034d0d8b4a6de7957df609f WatchSource:0}: Error finding container 04254092e9910ce0bc4c78e8041498d5f45451155034d0d8b4a6de7957df609f: Status 404 returned error can't find the container with id 04254092e9910ce0bc4c78e8041498d5f45451155034d0d8b4a6de7957df609f Apr 24 21:49:26.510456 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.510426 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aadad426-f0be-479b-9a3b-c0b06acec4c4-openshift-service-ca-bundle\") pod \"aadad426-f0be-479b-9a3b-c0b06acec4c4\" (UID: \"aadad426-f0be-479b-9a3b-c0b06acec4c4\") " Apr 24 21:49:26.510615 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.510481 2574 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aadad426-f0be-479b-9a3b-c0b06acec4c4-proxy-tls\") pod \"aadad426-f0be-479b-9a3b-c0b06acec4c4\" (UID: \"aadad426-f0be-479b-9a3b-c0b06acec4c4\") " Apr 24 21:49:26.510785 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.510762 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aadad426-f0be-479b-9a3b-c0b06acec4c4-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "aadad426-f0be-479b-9a3b-c0b06acec4c4" (UID: "aadad426-f0be-479b-9a3b-c0b06acec4c4"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:49:26.512561 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.512539 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aadad426-f0be-479b-9a3b-c0b06acec4c4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "aadad426-f0be-479b-9a3b-c0b06acec4c4" (UID: "aadad426-f0be-479b-9a3b-c0b06acec4c4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:49:26.611754 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.611715 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aadad426-f0be-479b-9a3b-c0b06acec4c4-openshift-service-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:49:26.611754 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:26.611745 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aadad426-f0be-479b-9a3b-c0b06acec4c4-proxy-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:49:27.313301 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:27.313270 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" Apr 24 21:49:27.313769 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:27.313272 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl" event={"ID":"aadad426-f0be-479b-9a3b-c0b06acec4c4","Type":"ContainerDied","Data":"1dd5b7b03dba41e59d6508cf9c7f889926ad9a9e47326721657d234c9e572db9"} Apr 24 21:49:27.313769 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:27.313400 2574 scope.go:117] "RemoveContainer" containerID="11d5827fc293262171bd6b71d7c763772efc9d79b2a1a5be62269d6dde44f45e" Apr 24 21:49:27.314707 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:27.314683 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" event={"ID":"95d06567-f7c3-4722-a71e-23e7c86ee03f","Type":"ContainerStarted","Data":"f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de"} Apr 24 21:49:27.314786 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:27.314719 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" event={"ID":"95d06567-f7c3-4722-a71e-23e7c86ee03f","Type":"ContainerStarted","Data":"04254092e9910ce0bc4c78e8041498d5f45451155034d0d8b4a6de7957df609f"} Apr 24 21:49:27.314887 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:27.314871 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:49:27.329681 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:27.329649 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl"] Apr 24 21:49:27.333546 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:27.333518 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-2525b-8997d599-kctgl"] Apr 24 21:49:27.346255 ip-10-0-133-48 
kubenswrapper[2574]: I0424 21:49:27.346200 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" podStartSLOduration=1.346183239 podStartE2EDuration="1.346183239s" podCreationTimestamp="2026-04-24 21:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:49:27.345577145 +0000 UTC m=+1374.692483490" watchObservedRunningTime="2026-04-24 21:49:27.346183239 +0000 UTC m=+1374.693089586" Apr 24 21:49:29.146827 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:29.146791 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aadad426-f0be-479b-9a3b-c0b06acec4c4" path="/var/lib/kubelet/pods/aadad426-f0be-479b-9a3b-c0b06acec4c4/volumes" Apr 24 21:49:33.324069 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:33.324036 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:49:36.078701 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:36.078662 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7"] Apr 24 21:49:36.079110 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:36.078901 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" podUID="95d06567-f7c3-4722-a71e-23e7c86ee03f" containerName="sequence-graph-b9878" containerID="cri-o://f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de" gracePeriod=30 Apr 24 21:49:38.322984 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:38.322943 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" podUID="95d06567-f7c3-4722-a71e-23e7c86ee03f" containerName="sequence-graph-b9878" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 24 21:49:43.324305 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:43.324261 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" podUID="95d06567-f7c3-4722-a71e-23e7c86ee03f" containerName="sequence-graph-b9878" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:49:48.322845 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:48.322797 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" podUID="95d06567-f7c3-4722-a71e-23e7c86ee03f" containerName="sequence-graph-b9878" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:49:48.323276 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:48.322945 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:49:53.323129 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:53.323083 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" podUID="95d06567-f7c3-4722-a71e-23e7c86ee03f" containerName="sequence-graph-b9878" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:49:58.322699 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:49:58.322655 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" podUID="95d06567-f7c3-4722-a71e-23e7c86ee03f" containerName="sequence-graph-b9878" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:50:03.323126 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:03.323086 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" podUID="95d06567-f7c3-4722-a71e-23e7c86ee03f" containerName="sequence-graph-b9878" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 24 21:50:06.230164 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.230137 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:50:06.319762 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.319720 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d06567-f7c3-4722-a71e-23e7c86ee03f-openshift-service-ca-bundle\") pod \"95d06567-f7c3-4722-a71e-23e7c86ee03f\" (UID: \"95d06567-f7c3-4722-a71e-23e7c86ee03f\") " Apr 24 21:50:06.319762 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.319763 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95d06567-f7c3-4722-a71e-23e7c86ee03f-proxy-tls\") pod \"95d06567-f7c3-4722-a71e-23e7c86ee03f\" (UID: \"95d06567-f7c3-4722-a71e-23e7c86ee03f\") " Apr 24 21:50:06.320136 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.320110 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95d06567-f7c3-4722-a71e-23e7c86ee03f-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "95d06567-f7c3-4722-a71e-23e7c86ee03f" (UID: "95d06567-f7c3-4722-a71e-23e7c86ee03f"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:06.321933 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.321898 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d06567-f7c3-4722-a71e-23e7c86ee03f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "95d06567-f7c3-4722-a71e-23e7c86ee03f" (UID: "95d06567-f7c3-4722-a71e-23e7c86ee03f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:06.420847 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.420760 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d06567-f7c3-4722-a71e-23e7c86ee03f-openshift-service-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:50:06.420847 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.420790 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95d06567-f7c3-4722-a71e-23e7c86ee03f-proxy-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:50:06.429696 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.429665 2574 generic.go:358] "Generic (PLEG): container finished" podID="95d06567-f7c3-4722-a71e-23e7c86ee03f" containerID="f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de" exitCode=0 Apr 24 21:50:06.429849 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.429724 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" Apr 24 21:50:06.429849 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.429746 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" event={"ID":"95d06567-f7c3-4722-a71e-23e7c86ee03f","Type":"ContainerDied","Data":"f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de"} Apr 24 21:50:06.429849 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.429781 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7" event={"ID":"95d06567-f7c3-4722-a71e-23e7c86ee03f","Type":"ContainerDied","Data":"04254092e9910ce0bc4c78e8041498d5f45451155034d0d8b4a6de7957df609f"} Apr 24 21:50:06.429849 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.429796 2574 scope.go:117] "RemoveContainer" containerID="f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de" Apr 24 21:50:06.437957 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.437928 2574 scope.go:117] "RemoveContainer" containerID="f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de" Apr 24 21:50:06.438258 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:50:06.438240 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de\": container with ID starting with f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de not found: ID does not exist" containerID="f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de" Apr 24 21:50:06.438313 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.438267 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de"} err="failed to get container status 
\"f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de\": rpc error: code = NotFound desc = could not find container \"f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de\": container with ID starting with f2fd196168298aa984bdc4dcab0c1ce068f1abeb6fa9c021fd9994bf545254de not found: ID does not exist" Apr 24 21:50:06.449845 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.449812 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7"] Apr 24 21:50:06.455066 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.455039 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-b9878-79b8776d89-hngl7"] Apr 24 21:50:06.507990 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.507949 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p"] Apr 24 21:50:06.508365 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.508349 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95d06567-f7c3-4722-a71e-23e7c86ee03f" containerName="sequence-graph-b9878" Apr 24 21:50:06.508455 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.508369 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d06567-f7c3-4722-a71e-23e7c86ee03f" containerName="sequence-graph-b9878" Apr 24 21:50:06.508455 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.508387 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aadad426-f0be-479b-9a3b-c0b06acec4c4" containerName="ensemble-graph-2525b" Apr 24 21:50:06.508455 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.508395 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aadad426-f0be-479b-9a3b-c0b06acec4c4" containerName="ensemble-graph-2525b" Apr 24 21:50:06.508654 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.508485 2574 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="aadad426-f0be-479b-9a3b-c0b06acec4c4" containerName="ensemble-graph-2525b" Apr 24 21:50:06.508654 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.508499 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="95d06567-f7c3-4722-a71e-23e7c86ee03f" containerName="sequence-graph-b9878" Apr 24 21:50:06.512991 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.512964 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:50:06.515167 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.515136 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-1dabb-kube-rbac-proxy-sar-config\"" Apr 24 21:50:06.515313 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.515177 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:50:06.515313 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.515179 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-6ztvm\"" Apr 24 21:50:06.515442 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.515400 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-1dabb-serving-cert\"" Apr 24 21:50:06.521059 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.521035 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p"] Apr 24 21:50:06.622247 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.622208 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/420f11c7-ea6b-41a3-b4e9-64434becd5ba-openshift-service-ca-bundle\") pod \"ensemble-graph-1dabb-976bfb698-7kt9p\" (UID: 
\"420f11c7-ea6b-41a3-b4e9-64434becd5ba\") " pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:50:06.622247 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.622253 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/420f11c7-ea6b-41a3-b4e9-64434becd5ba-proxy-tls\") pod \"ensemble-graph-1dabb-976bfb698-7kt9p\" (UID: \"420f11c7-ea6b-41a3-b4e9-64434becd5ba\") " pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:50:06.722632 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.722603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/420f11c7-ea6b-41a3-b4e9-64434becd5ba-openshift-service-ca-bundle\") pod \"ensemble-graph-1dabb-976bfb698-7kt9p\" (UID: \"420f11c7-ea6b-41a3-b4e9-64434becd5ba\") " pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:50:06.722632 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.722636 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/420f11c7-ea6b-41a3-b4e9-64434becd5ba-proxy-tls\") pod \"ensemble-graph-1dabb-976bfb698-7kt9p\" (UID: \"420f11c7-ea6b-41a3-b4e9-64434becd5ba\") " pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:50:06.722848 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:50:06.722755 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-1dabb-serving-cert: secret "ensemble-graph-1dabb-serving-cert" not found Apr 24 21:50:06.722848 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:50:06.722820 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/420f11c7-ea6b-41a3-b4e9-64434becd5ba-proxy-tls podName:420f11c7-ea6b-41a3-b4e9-64434becd5ba nodeName:}" failed. 
No retries permitted until 2026-04-24 21:50:07.222801788 +0000 UTC m=+1414.569708112 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/420f11c7-ea6b-41a3-b4e9-64434becd5ba-proxy-tls") pod "ensemble-graph-1dabb-976bfb698-7kt9p" (UID: "420f11c7-ea6b-41a3-b4e9-64434becd5ba") : secret "ensemble-graph-1dabb-serving-cert" not found Apr 24 21:50:06.723235 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:06.723213 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/420f11c7-ea6b-41a3-b4e9-64434becd5ba-openshift-service-ca-bundle\") pod \"ensemble-graph-1dabb-976bfb698-7kt9p\" (UID: \"420f11c7-ea6b-41a3-b4e9-64434becd5ba\") " pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:50:07.146989 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:07.146900 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d06567-f7c3-4722-a71e-23e7c86ee03f" path="/var/lib/kubelet/pods/95d06567-f7c3-4722-a71e-23e7c86ee03f/volumes" Apr 24 21:50:07.227179 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:07.227128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/420f11c7-ea6b-41a3-b4e9-64434becd5ba-proxy-tls\") pod \"ensemble-graph-1dabb-976bfb698-7kt9p\" (UID: \"420f11c7-ea6b-41a3-b4e9-64434becd5ba\") " pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:50:07.229551 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:07.229525 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/420f11c7-ea6b-41a3-b4e9-64434becd5ba-proxy-tls\") pod \"ensemble-graph-1dabb-976bfb698-7kt9p\" (UID: \"420f11c7-ea6b-41a3-b4e9-64434becd5ba\") " pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:50:07.424807 ip-10-0-133-48 
kubenswrapper[2574]: I0424 21:50:07.424694 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:50:07.565117 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:07.565081 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p"] Apr 24 21:50:07.568409 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:50:07.568379 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod420f11c7_ea6b_41a3_b4e9_64434becd5ba.slice/crio-2754085dcbd1a008cdf3b147698138331cc221979ff198e4fc631bb327fca879 WatchSource:0}: Error finding container 2754085dcbd1a008cdf3b147698138331cc221979ff198e4fc631bb327fca879: Status 404 returned error can't find the container with id 2754085dcbd1a008cdf3b147698138331cc221979ff198e4fc631bb327fca879 Apr 24 21:50:08.437560 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:08.437523 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" event={"ID":"420f11c7-ea6b-41a3-b4e9-64434becd5ba","Type":"ContainerStarted","Data":"abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74"} Apr 24 21:50:08.437560 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:08.437561 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" event={"ID":"420f11c7-ea6b-41a3-b4e9-64434becd5ba","Type":"ContainerStarted","Data":"2754085dcbd1a008cdf3b147698138331cc221979ff198e4fc631bb327fca879"} Apr 24 21:50:08.438066 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:08.437628 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:50:08.455943 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:08.455879 2574 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" podStartSLOduration=2.455857809 podStartE2EDuration="2.455857809s" podCreationTimestamp="2026-04-24 21:50:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:50:08.455143707 +0000 UTC m=+1415.802050048" watchObservedRunningTime="2026-04-24 21:50:08.455857809 +0000 UTC m=+1415.802764158" Apr 24 21:50:14.446945 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:14.446862 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:50:36.282294 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:36.282256 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5"] Apr 24 21:50:36.286131 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:36.286103 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:50:36.288593 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:36.288567 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-1ec3f-serving-cert\"" Apr 24 21:50:36.288593 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:36.288568 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-1ec3f-kube-rbac-proxy-sar-config\"" Apr 24 21:50:36.292749 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:36.292719 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5"] Apr 24 21:50:36.377423 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:36.377378 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93c48cd8-ba73-462d-aaa2-e944e60e61dd-proxy-tls\") pod \"sequence-graph-1ec3f-5f6c4cf864-t84z5\" (UID: \"93c48cd8-ba73-462d-aaa2-e944e60e61dd\") " pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:50:36.377620 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:36.377507 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93c48cd8-ba73-462d-aaa2-e944e60e61dd-openshift-service-ca-bundle\") pod \"sequence-graph-1ec3f-5f6c4cf864-t84z5\" (UID: \"93c48cd8-ba73-462d-aaa2-e944e60e61dd\") " pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:50:36.478612 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:36.478575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93c48cd8-ba73-462d-aaa2-e944e60e61dd-proxy-tls\") pod \"sequence-graph-1ec3f-5f6c4cf864-t84z5\" (UID: 
\"93c48cd8-ba73-462d-aaa2-e944e60e61dd\") " pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:50:36.478800 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:36.478671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93c48cd8-ba73-462d-aaa2-e944e60e61dd-openshift-service-ca-bundle\") pod \"sequence-graph-1ec3f-5f6c4cf864-t84z5\" (UID: \"93c48cd8-ba73-462d-aaa2-e944e60e61dd\") " pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:50:36.478800 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:50:36.478736 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-1ec3f-serving-cert: secret "sequence-graph-1ec3f-serving-cert" not found Apr 24 21:50:36.478872 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:50:36.478820 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93c48cd8-ba73-462d-aaa2-e944e60e61dd-proxy-tls podName:93c48cd8-ba73-462d-aaa2-e944e60e61dd nodeName:}" failed. No retries permitted until 2026-04-24 21:50:36.978795792 +0000 UTC m=+1444.325702131 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/93c48cd8-ba73-462d-aaa2-e944e60e61dd-proxy-tls") pod "sequence-graph-1ec3f-5f6c4cf864-t84z5" (UID: "93c48cd8-ba73-462d-aaa2-e944e60e61dd") : secret "sequence-graph-1ec3f-serving-cert" not found Apr 24 21:50:36.479372 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:36.479354 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93c48cd8-ba73-462d-aaa2-e944e60e61dd-openshift-service-ca-bundle\") pod \"sequence-graph-1ec3f-5f6c4cf864-t84z5\" (UID: \"93c48cd8-ba73-462d-aaa2-e944e60e61dd\") " pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:50:36.983795 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:36.983757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93c48cd8-ba73-462d-aaa2-e944e60e61dd-proxy-tls\") pod \"sequence-graph-1ec3f-5f6c4cf864-t84z5\" (UID: \"93c48cd8-ba73-462d-aaa2-e944e60e61dd\") " pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:50:36.986154 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:36.986129 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93c48cd8-ba73-462d-aaa2-e944e60e61dd-proxy-tls\") pod \"sequence-graph-1ec3f-5f6c4cf864-t84z5\" (UID: \"93c48cd8-ba73-462d-aaa2-e944e60e61dd\") " pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:50:37.197471 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:37.197432 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:50:37.325996 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:37.325960 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5"] Apr 24 21:50:37.329005 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:50:37.328975 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93c48cd8_ba73_462d_aaa2_e944e60e61dd.slice/crio-1b3fad9ba61e2232dcadede132880bed156c927d6f2d4d2f971afb42adcce2d0 WatchSource:0}: Error finding container 1b3fad9ba61e2232dcadede132880bed156c927d6f2d4d2f971afb42adcce2d0: Status 404 returned error can't find the container with id 1b3fad9ba61e2232dcadede132880bed156c927d6f2d4d2f971afb42adcce2d0 Apr 24 21:50:37.526798 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:37.526697 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" event={"ID":"93c48cd8-ba73-462d-aaa2-e944e60e61dd","Type":"ContainerStarted","Data":"eee422b287f2391983bd91429f42704f59df68a88be0086a15c8fac9600077de"} Apr 24 21:50:37.526798 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:37.526748 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" event={"ID":"93c48cd8-ba73-462d-aaa2-e944e60e61dd","Type":"ContainerStarted","Data":"1b3fad9ba61e2232dcadede132880bed156c927d6f2d4d2f971afb42adcce2d0"} Apr 24 21:50:37.526798 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:37.526788 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:50:37.544917 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:37.544852 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" 
podStartSLOduration=1.54483444 podStartE2EDuration="1.54483444s" podCreationTimestamp="2026-04-24 21:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:50:37.543717701 +0000 UTC m=+1444.890624048" watchObservedRunningTime="2026-04-24 21:50:37.54483444 +0000 UTC m=+1444.891740785" Apr 24 21:50:43.535629 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:50:43.535600 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:51:33.184211 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:51:33.184184 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:51:33.189080 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:51:33.189057 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:51:33.190459 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:51:33.190440 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:51:33.195038 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:51:33.195020 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:56:33.210180 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:56:33.210024 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:56:33.214292 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:56:33.214265 2574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 21:56:33.215960 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:56:33.215938 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:56:33.219953 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:56:33.219926 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log" Apr 24 21:58:21.104683 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:21.104645 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p"] Apr 24 21:58:21.105286 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:21.104979 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" podUID="420f11c7-ea6b-41a3-b4e9-64434becd5ba" containerName="ensemble-graph-1dabb" containerID="cri-o://abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74" gracePeriod=30 Apr 24 21:58:24.445244 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:24.445186 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" podUID="420f11c7-ea6b-41a3-b4e9-64434becd5ba" containerName="ensemble-graph-1dabb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:58:29.445063 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:29.445023 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" podUID="420f11c7-ea6b-41a3-b4e9-64434becd5ba" containerName="ensemble-graph-1dabb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:58:34.444448 
ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:34.444408 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" podUID="420f11c7-ea6b-41a3-b4e9-64434becd5ba" containerName="ensemble-graph-1dabb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:58:34.444870 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:34.444553 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:58:39.444995 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:39.444949 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" podUID="420f11c7-ea6b-41a3-b4e9-64434becd5ba" containerName="ensemble-graph-1dabb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:58:44.444565 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:44.444520 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" podUID="420f11c7-ea6b-41a3-b4e9-64434becd5ba" containerName="ensemble-graph-1dabb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:58:49.444684 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:49.444642 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" podUID="420f11c7-ea6b-41a3-b4e9-64434becd5ba" containerName="ensemble-graph-1dabb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:58:51.007909 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:51.007869 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5"] Apr 24 21:58:51.008345 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:51.008100 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" podUID="93c48cd8-ba73-462d-aaa2-e944e60e61dd" containerName="sequence-graph-1ec3f" containerID="cri-o://eee422b287f2391983bd91429f42704f59df68a88be0086a15c8fac9600077de" gracePeriod=30 Apr 24 21:58:51.260192 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:51.260125 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:58:51.365989 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:51.365962 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/420f11c7-ea6b-41a3-b4e9-64434becd5ba-openshift-service-ca-bundle\") pod \"420f11c7-ea6b-41a3-b4e9-64434becd5ba\" (UID: \"420f11c7-ea6b-41a3-b4e9-64434becd5ba\") " Apr 24 21:58:51.366194 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:51.366061 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/420f11c7-ea6b-41a3-b4e9-64434becd5ba-proxy-tls\") pod \"420f11c7-ea6b-41a3-b4e9-64434becd5ba\" (UID: \"420f11c7-ea6b-41a3-b4e9-64434becd5ba\") " Apr 24 21:58:51.366349 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:51.366306 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/420f11c7-ea6b-41a3-b4e9-64434becd5ba-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "420f11c7-ea6b-41a3-b4e9-64434becd5ba" (UID: "420f11c7-ea6b-41a3-b4e9-64434becd5ba"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:58:51.368105 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:51.368073 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/420f11c7-ea6b-41a3-b4e9-64434becd5ba-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "420f11c7-ea6b-41a3-b4e9-64434becd5ba" (UID: "420f11c7-ea6b-41a3-b4e9-64434becd5ba"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:58:51.466673 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:51.466631 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/420f11c7-ea6b-41a3-b4e9-64434becd5ba-proxy-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:58:51.466673 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:51.466667 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/420f11c7-ea6b-41a3-b4e9-64434becd5ba-openshift-service-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:58:52.025792 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:52.025759 2574 generic.go:358] "Generic (PLEG): container finished" podID="420f11c7-ea6b-41a3-b4e9-64434becd5ba" containerID="abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74" exitCode=0 Apr 24 21:58:52.026255 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:52.025806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" event={"ID":"420f11c7-ea6b-41a3-b4e9-64434becd5ba","Type":"ContainerDied","Data":"abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74"} Apr 24 21:58:52.026255 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:52.025822 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" Apr 24 21:58:52.026255 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:52.025840 2574 scope.go:117] "RemoveContainer" containerID="abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74" Apr 24 21:58:52.026255 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:52.025830 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p" event={"ID":"420f11c7-ea6b-41a3-b4e9-64434becd5ba","Type":"ContainerDied","Data":"2754085dcbd1a008cdf3b147698138331cc221979ff198e4fc631bb327fca879"} Apr 24 21:58:52.034096 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:52.034073 2574 scope.go:117] "RemoveContainer" containerID="abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74" Apr 24 21:58:52.034373 ip-10-0-133-48 kubenswrapper[2574]: E0424 21:58:52.034355 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74\": container with ID starting with abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74 not found: ID does not exist" containerID="abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74" Apr 24 21:58:52.034432 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:52.034382 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74"} err="failed to get container status \"abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74\": rpc error: code = NotFound desc = could not find container \"abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74\": container with ID starting with abd127a5c942c69460fd3fd9eb088e1b205235a7f02d8e1d77141a209a778a74 not found: ID does not exist" Apr 24 21:58:52.046204 ip-10-0-133-48 kubenswrapper[2574]: I0424 
21:58:52.046173 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p"] Apr 24 21:58:52.052170 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:52.052141 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-1dabb-976bfb698-7kt9p"] Apr 24 21:58:53.148041 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:53.148000 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="420f11c7-ea6b-41a3-b4e9-64434becd5ba" path="/var/lib/kubelet/pods/420f11c7-ea6b-41a3-b4e9-64434becd5ba/volumes" Apr 24 21:58:53.534427 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:53.534388 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" podUID="93c48cd8-ba73-462d-aaa2-e944e60e61dd" containerName="sequence-graph-1ec3f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:58:58.534124 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:58:58.534086 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" podUID="93c48cd8-ba73-462d-aaa2-e944e60e61dd" containerName="sequence-graph-1ec3f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:59:03.534204 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:03.534161 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" podUID="93c48cd8-ba73-462d-aaa2-e944e60e61dd" containerName="sequence-graph-1ec3f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:59:03.534619 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:03.534270 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:59:08.533989 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:08.533945 2574 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" podUID="93c48cd8-ba73-462d-aaa2-e944e60e61dd" containerName="sequence-graph-1ec3f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:59:13.534388 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:13.534276 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" podUID="93c48cd8-ba73-462d-aaa2-e944e60e61dd" containerName="sequence-graph-1ec3f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:59:18.534095 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:18.534044 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" podUID="93c48cd8-ba73-462d-aaa2-e944e60e61dd" containerName="sequence-graph-1ec3f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:59:21.121207 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.121175 2574 generic.go:358] "Generic (PLEG): container finished" podID="93c48cd8-ba73-462d-aaa2-e944e60e61dd" containerID="eee422b287f2391983bd91429f42704f59df68a88be0086a15c8fac9600077de" exitCode=0 Apr 24 21:59:21.121628 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.121250 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" event={"ID":"93c48cd8-ba73-462d-aaa2-e944e60e61dd","Type":"ContainerDied","Data":"eee422b287f2391983bd91429f42704f59df68a88be0086a15c8fac9600077de"} Apr 24 21:59:21.387757 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.387678 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7"] Apr 24 21:59:21.388043 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.388029 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="420f11c7-ea6b-41a3-b4e9-64434becd5ba" 
containerName="ensemble-graph-1dabb" Apr 24 21:59:21.388043 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.388044 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="420f11c7-ea6b-41a3-b4e9-64434becd5ba" containerName="ensemble-graph-1dabb" Apr 24 21:59:21.388120 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.388104 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="420f11c7-ea6b-41a3-b4e9-64434becd5ba" containerName="ensemble-graph-1dabb" Apr 24 21:59:21.390966 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.390948 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 21:59:21.393607 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.393588 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-a4170-serving-cert\"" Apr 24 21:59:21.393720 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.393666 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-a4170-kube-rbac-proxy-sar-config\"" Apr 24 21:59:21.400237 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.400210 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7"] Apr 24 21:59:21.422162 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.422123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d18794b-cdfc-4b07-9712-3c52c56b0a54-openshift-service-ca-bundle\") pod \"splitter-graph-a4170-65f94cfc4d-7lwc7\" (UID: \"3d18794b-cdfc-4b07-9712-3c52c56b0a54\") " pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 21:59:21.422362 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.422230 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d18794b-cdfc-4b07-9712-3c52c56b0a54-proxy-tls\") pod \"splitter-graph-a4170-65f94cfc4d-7lwc7\" (UID: \"3d18794b-cdfc-4b07-9712-3c52c56b0a54\") " pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 21:59:21.523636 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.523608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d18794b-cdfc-4b07-9712-3c52c56b0a54-proxy-tls\") pod \"splitter-graph-a4170-65f94cfc4d-7lwc7\" (UID: \"3d18794b-cdfc-4b07-9712-3c52c56b0a54\") " pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 21:59:21.523713 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.523656 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d18794b-cdfc-4b07-9712-3c52c56b0a54-openshift-service-ca-bundle\") pod \"splitter-graph-a4170-65f94cfc4d-7lwc7\" (UID: \"3d18794b-cdfc-4b07-9712-3c52c56b0a54\") " pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 21:59:21.524253 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.524235 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d18794b-cdfc-4b07-9712-3c52c56b0a54-openshift-service-ca-bundle\") pod \"splitter-graph-a4170-65f94cfc4d-7lwc7\" (UID: \"3d18794b-cdfc-4b07-9712-3c52c56b0a54\") " pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 21:59:21.526142 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.526114 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d18794b-cdfc-4b07-9712-3c52c56b0a54-proxy-tls\") pod \"splitter-graph-a4170-65f94cfc4d-7lwc7\" (UID: \"3d18794b-cdfc-4b07-9712-3c52c56b0a54\") " 
pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 21:59:21.646051 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.645985 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:59:21.701926 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.701883 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 21:59:21.726041 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.726007 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93c48cd8-ba73-462d-aaa2-e944e60e61dd-proxy-tls\") pod \"93c48cd8-ba73-462d-aaa2-e944e60e61dd\" (UID: \"93c48cd8-ba73-462d-aaa2-e944e60e61dd\") " Apr 24 21:59:21.726225 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.726049 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93c48cd8-ba73-462d-aaa2-e944e60e61dd-openshift-service-ca-bundle\") pod \"93c48cd8-ba73-462d-aaa2-e944e60e61dd\" (UID: \"93c48cd8-ba73-462d-aaa2-e944e60e61dd\") " Apr 24 21:59:21.726452 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.726424 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93c48cd8-ba73-462d-aaa2-e944e60e61dd-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "93c48cd8-ba73-462d-aaa2-e944e60e61dd" (UID: "93c48cd8-ba73-462d-aaa2-e944e60e61dd"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:59:21.728009 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.727983 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c48cd8-ba73-462d-aaa2-e944e60e61dd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "93c48cd8-ba73-462d-aaa2-e944e60e61dd" (UID: "93c48cd8-ba73-462d-aaa2-e944e60e61dd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:59:21.826169 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.826022 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7"] Apr 24 21:59:21.830804 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.827076 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/93c48cd8-ba73-462d-aaa2-e944e60e61dd-proxy-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:59:21.830804 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.827134 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93c48cd8-ba73-462d-aaa2-e944e60e61dd-openshift-service-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 21:59:21.833079 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:21.831939 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:59:22.124986 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:22.124947 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" event={"ID":"93c48cd8-ba73-462d-aaa2-e944e60e61dd","Type":"ContainerDied","Data":"1b3fad9ba61e2232dcadede132880bed156c927d6f2d4d2f971afb42adcce2d0"} Apr 24 21:59:22.124986 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:22.124993 2574 scope.go:117] "RemoveContainer" 
containerID="eee422b287f2391983bd91429f42704f59df68a88be0086a15c8fac9600077de" Apr 24 21:59:22.125515 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:22.124989 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5" Apr 24 21:59:22.126389 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:22.126358 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" event={"ID":"3d18794b-cdfc-4b07-9712-3c52c56b0a54","Type":"ContainerStarted","Data":"6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc"} Apr 24 21:59:22.126557 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:22.126394 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" event={"ID":"3d18794b-cdfc-4b07-9712-3c52c56b0a54","Type":"ContainerStarted","Data":"c62ebfd6e70a53bd8a04a9fd76752aa6d64b5ea339f964c719e93b3d3139599e"} Apr 24 21:59:22.126557 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:22.126486 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 21:59:22.151864 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:22.151806 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" podStartSLOduration=1.1517905609999999 podStartE2EDuration="1.151790561s" podCreationTimestamp="2026-04-24 21:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:59:22.148124953 +0000 UTC m=+1969.495031298" watchObservedRunningTime="2026-04-24 21:59:22.151790561 +0000 UTC m=+1969.498696906" Apr 24 21:59:22.166381 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:22.166345 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5"] Apr 24 21:59:22.174367 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:22.174324 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-1ec3f-5f6c4cf864-t84z5"] Apr 24 21:59:23.147301 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:23.147268 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93c48cd8-ba73-462d-aaa2-e944e60e61dd" path="/var/lib/kubelet/pods/93c48cd8-ba73-462d-aaa2-e944e60e61dd/volumes" Apr 24 21:59:28.136558 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:28.136529 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 21:59:31.437424 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:31.437391 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7"] Apr 24 21:59:31.437871 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:31.437595 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" podUID="3d18794b-cdfc-4b07-9712-3c52c56b0a54" containerName="splitter-graph-a4170" containerID="cri-o://6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc" gracePeriod=30 Apr 24 21:59:33.135041 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:33.134991 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" podUID="3d18794b-cdfc-4b07-9712-3c52c56b0a54" containerName="splitter-graph-a4170" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:59:38.134981 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:38.134934 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" podUID="3d18794b-cdfc-4b07-9712-3c52c56b0a54" containerName="splitter-graph-a4170" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:59:43.134795 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:43.134756 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" podUID="3d18794b-cdfc-4b07-9712-3c52c56b0a54" containerName="splitter-graph-a4170" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:59:43.135190 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:43.134871 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 21:59:48.135276 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:48.135233 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" podUID="3d18794b-cdfc-4b07-9712-3c52c56b0a54" containerName="splitter-graph-a4170" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:59:51.249928 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.249878 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk"] Apr 24 21:59:51.250702 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.250679 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93c48cd8-ba73-462d-aaa2-e944e60e61dd" containerName="sequence-graph-1ec3f" Apr 24 21:59:51.250792 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.250704 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c48cd8-ba73-462d-aaa2-e944e60e61dd" containerName="sequence-graph-1ec3f" Apr 24 21:59:51.250863 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.250850 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="93c48cd8-ba73-462d-aaa2-e944e60e61dd" containerName="sequence-graph-1ec3f" Apr 24 21:59:51.258515 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.258487 2574 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" Apr 24 21:59:51.261250 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.261219 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-f4deb-serving-cert\"" Apr 24 21:59:51.261250 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.261247 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-f4deb-kube-rbac-proxy-sar-config\"" Apr 24 21:59:51.261576 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.261553 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk"] Apr 24 21:59:51.387189 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.387151 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b9a1e05-3423-4c72-aa72-f8c7e0e53360-openshift-service-ca-bundle\") pod \"switch-graph-f4deb-6fd5579b57-5tldk\" (UID: \"7b9a1e05-3423-4c72-aa72-f8c7e0e53360\") " pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" Apr 24 21:59:51.387189 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.387192 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b9a1e05-3423-4c72-aa72-f8c7e0e53360-proxy-tls\") pod \"switch-graph-f4deb-6fd5579b57-5tldk\" (UID: \"7b9a1e05-3423-4c72-aa72-f8c7e0e53360\") " pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" Apr 24 21:59:51.487796 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.487757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b9a1e05-3423-4c72-aa72-f8c7e0e53360-openshift-service-ca-bundle\") pod 
\"switch-graph-f4deb-6fd5579b57-5tldk\" (UID: \"7b9a1e05-3423-4c72-aa72-f8c7e0e53360\") " pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" Apr 24 21:59:51.487796 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.487799 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b9a1e05-3423-4c72-aa72-f8c7e0e53360-proxy-tls\") pod \"switch-graph-f4deb-6fd5579b57-5tldk\" (UID: \"7b9a1e05-3423-4c72-aa72-f8c7e0e53360\") " pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" Apr 24 21:59:51.488440 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.488421 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b9a1e05-3423-4c72-aa72-f8c7e0e53360-openshift-service-ca-bundle\") pod \"switch-graph-f4deb-6fd5579b57-5tldk\" (UID: \"7b9a1e05-3423-4c72-aa72-f8c7e0e53360\") " pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" Apr 24 21:59:51.490254 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.490234 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b9a1e05-3423-4c72-aa72-f8c7e0e53360-proxy-tls\") pod \"switch-graph-f4deb-6fd5579b57-5tldk\" (UID: \"7b9a1e05-3423-4c72-aa72-f8c7e0e53360\") " pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" Apr 24 21:59:51.569963 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.569867 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" Apr 24 21:59:51.695829 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:51.695805 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk"] Apr 24 21:59:51.698578 ip-10-0-133-48 kubenswrapper[2574]: W0424 21:59:51.698548 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b9a1e05_3423_4c72_aa72_f8c7e0e53360.slice/crio-2f60d36adeb1bbfaa1d0b8d8da31fc50de8fd2705d56d1ac3bc47348e5cf3401 WatchSource:0}: Error finding container 2f60d36adeb1bbfaa1d0b8d8da31fc50de8fd2705d56d1ac3bc47348e5cf3401: Status 404 returned error can't find the container with id 2f60d36adeb1bbfaa1d0b8d8da31fc50de8fd2705d56d1ac3bc47348e5cf3401 Apr 24 21:59:52.219324 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:52.219285 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" event={"ID":"7b9a1e05-3423-4c72-aa72-f8c7e0e53360","Type":"ContainerStarted","Data":"9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e"} Apr 24 21:59:52.219324 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:52.219342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" event={"ID":"7b9a1e05-3423-4c72-aa72-f8c7e0e53360","Type":"ContainerStarted","Data":"2f60d36adeb1bbfaa1d0b8d8da31fc50de8fd2705d56d1ac3bc47348e5cf3401"} Apr 24 21:59:52.219582 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:52.219362 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" Apr 24 21:59:52.239352 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:52.239285 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" 
podStartSLOduration=1.239268503 podStartE2EDuration="1.239268503s" podCreationTimestamp="2026-04-24 21:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:59:52.237598542 +0000 UTC m=+1999.584504888" watchObservedRunningTime="2026-04-24 21:59:52.239268503 +0000 UTC m=+1999.586174828" Apr 24 21:59:53.134618 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:53.134579 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" podUID="3d18794b-cdfc-4b07-9712-3c52c56b0a54" containerName="splitter-graph-a4170" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:59:58.135278 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:58.135236 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" podUID="3d18794b-cdfc-4b07-9712-3c52c56b0a54" containerName="splitter-graph-a4170" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:59:58.228325 ip-10-0-133-48 kubenswrapper[2574]: I0424 21:59:58.228294 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" Apr 24 22:00:01.623417 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:01.623386 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 22:00:01.678230 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:01.678191 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d18794b-cdfc-4b07-9712-3c52c56b0a54-openshift-service-ca-bundle\") pod \"3d18794b-cdfc-4b07-9712-3c52c56b0a54\" (UID: \"3d18794b-cdfc-4b07-9712-3c52c56b0a54\") " Apr 24 22:00:01.678433 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:01.678264 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d18794b-cdfc-4b07-9712-3c52c56b0a54-proxy-tls\") pod \"3d18794b-cdfc-4b07-9712-3c52c56b0a54\" (UID: \"3d18794b-cdfc-4b07-9712-3c52c56b0a54\") " Apr 24 22:00:01.678611 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:01.678584 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d18794b-cdfc-4b07-9712-3c52c56b0a54-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "3d18794b-cdfc-4b07-9712-3c52c56b0a54" (UID: "3d18794b-cdfc-4b07-9712-3c52c56b0a54"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:00:01.680322 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:01.680298 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d18794b-cdfc-4b07-9712-3c52c56b0a54-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3d18794b-cdfc-4b07-9712-3c52c56b0a54" (UID: "3d18794b-cdfc-4b07-9712-3c52c56b0a54"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:00:01.778994 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:01.778890 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d18794b-cdfc-4b07-9712-3c52c56b0a54-proxy-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 22:00:01.778994 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:01.778936 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d18794b-cdfc-4b07-9712-3c52c56b0a54-openshift-service-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\"" Apr 24 22:00:02.248400 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:02.248357 2574 generic.go:358] "Generic (PLEG): container finished" podID="3d18794b-cdfc-4b07-9712-3c52c56b0a54" containerID="6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc" exitCode=0 Apr 24 22:00:02.248611 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:02.248422 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" Apr 24 22:00:02.248611 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:02.248435 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" event={"ID":"3d18794b-cdfc-4b07-9712-3c52c56b0a54","Type":"ContainerDied","Data":"6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc"} Apr 24 22:00:02.248611 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:02.248482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7" event={"ID":"3d18794b-cdfc-4b07-9712-3c52c56b0a54","Type":"ContainerDied","Data":"c62ebfd6e70a53bd8a04a9fd76752aa6d64b5ea339f964c719e93b3d3139599e"} Apr 24 22:00:02.248611 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:02.248504 2574 scope.go:117] "RemoveContainer" containerID="6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc" Apr 24 22:00:02.256794 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:02.256772 2574 scope.go:117] "RemoveContainer" containerID="6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc" Apr 24 22:00:02.257059 ip-10-0-133-48 kubenswrapper[2574]: E0424 22:00:02.257041 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc\": container with ID starting with 6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc not found: ID does not exist" containerID="6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc" Apr 24 22:00:02.257122 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:02.257073 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc"} err="failed to get container status 
\"6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc\": rpc error: code = NotFound desc = could not find container \"6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc\": container with ID starting with 6d171636543a8f5623f33f8a805953dd4c0a7b31216e9f238c983c5ea9445cbc not found: ID does not exist" Apr 24 22:00:02.272602 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:02.272569 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7"] Apr 24 22:00:02.275316 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:02.275291 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a4170-65f94cfc4d-7lwc7"] Apr 24 22:00:03.147173 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:03.147125 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d18794b-cdfc-4b07-9712-3c52c56b0a54" path="/var/lib/kubelet/pods/3d18794b-cdfc-4b07-9712-3c52c56b0a54/volumes" Apr 24 22:00:31.657197 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.657160 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"] Apr 24 22:00:31.657599 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.657489 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d18794b-cdfc-4b07-9712-3c52c56b0a54" containerName="splitter-graph-a4170" Apr 24 22:00:31.657599 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.657501 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d18794b-cdfc-4b07-9712-3c52c56b0a54" containerName="splitter-graph-a4170" Apr 24 22:00:31.657599 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.657562 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d18794b-cdfc-4b07-9712-3c52c56b0a54" containerName="splitter-graph-a4170" Apr 24 22:00:31.660395 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.660374 2574 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" Apr 24 22:00:31.662633 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.662606 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-600f1-serving-cert\"" Apr 24 22:00:31.662742 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.662634 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-600f1-kube-rbac-proxy-sar-config\"" Apr 24 22:00:31.669089 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.669061 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"] Apr 24 22:00:31.738576 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.738537 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd12d41f-dd99-4536-b3e7-da6f0745ca15-proxy-tls\") pod \"splitter-graph-600f1-6878b7dd5c-cttln\" (UID: \"dd12d41f-dd99-4536-b3e7-da6f0745ca15\") " pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" Apr 24 22:00:31.738775 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.738601 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd12d41f-dd99-4536-b3e7-da6f0745ca15-openshift-service-ca-bundle\") pod \"splitter-graph-600f1-6878b7dd5c-cttln\" (UID: \"dd12d41f-dd99-4536-b3e7-da6f0745ca15\") " pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" Apr 24 22:00:31.839087 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.839057 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd12d41f-dd99-4536-b3e7-da6f0745ca15-openshift-service-ca-bundle\") pod 
\"splitter-graph-600f1-6878b7dd5c-cttln\" (UID: \"dd12d41f-dd99-4536-b3e7-da6f0745ca15\") " pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"
Apr 24 22:00:31.839298 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.839143 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd12d41f-dd99-4536-b3e7-da6f0745ca15-proxy-tls\") pod \"splitter-graph-600f1-6878b7dd5c-cttln\" (UID: \"dd12d41f-dd99-4536-b3e7-da6f0745ca15\") " pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"
Apr 24 22:00:31.839883 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.839856 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd12d41f-dd99-4536-b3e7-da6f0745ca15-openshift-service-ca-bundle\") pod \"splitter-graph-600f1-6878b7dd5c-cttln\" (UID: \"dd12d41f-dd99-4536-b3e7-da6f0745ca15\") " pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"
Apr 24 22:00:31.841595 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.841564 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd12d41f-dd99-4536-b3e7-da6f0745ca15-proxy-tls\") pod \"splitter-graph-600f1-6878b7dd5c-cttln\" (UID: \"dd12d41f-dd99-4536-b3e7-da6f0745ca15\") " pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"
Apr 24 22:00:31.971070 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:31.971037 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"
Apr 24 22:00:32.094820 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:32.094783 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"]
Apr 24 22:00:32.097724 ip-10-0-133-48 kubenswrapper[2574]: W0424 22:00:32.097696 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd12d41f_dd99_4536_b3e7_da6f0745ca15.slice/crio-b54285810a09776994b6c95eb606aec1d44c4f29ea6bc3d854a004834d5aa5a5 WatchSource:0}: Error finding container b54285810a09776994b6c95eb606aec1d44c4f29ea6bc3d854a004834d5aa5a5: Status 404 returned error can't find the container with id b54285810a09776994b6c95eb606aec1d44c4f29ea6bc3d854a004834d5aa5a5
Apr 24 22:00:32.348079 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:32.347978 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" event={"ID":"dd12d41f-dd99-4536-b3e7-da6f0745ca15","Type":"ContainerStarted","Data":"82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b"}
Apr 24 22:00:32.348079 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:32.348016 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" event={"ID":"dd12d41f-dd99-4536-b3e7-da6f0745ca15","Type":"ContainerStarted","Data":"b54285810a09776994b6c95eb606aec1d44c4f29ea6bc3d854a004834d5aa5a5"}
Apr 24 22:00:32.348079 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:32.348039 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"
Apr 24 22:00:32.370371 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:32.370141 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" podStartSLOduration=1.3701205189999999 podStartE2EDuration="1.370120519s" podCreationTimestamp="2026-04-24 22:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:00:32.368967187 +0000 UTC m=+2039.715873533" watchObservedRunningTime="2026-04-24 22:00:32.370120519 +0000 UTC m=+2039.717026865"
Apr 24 22:00:38.357942 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:00:38.357912 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"
Apr 24 22:01:33.232926 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:01:33.232808 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log"
Apr 24 22:01:33.237031 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:01:33.237005 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log"
Apr 24 22:01:33.238668 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:01:33.238649 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log"
Apr 24 22:01:33.242639 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:01:33.242618 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log"
Apr 24 22:06:33.255174 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:06:33.255060 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log"
Apr 24 22:06:33.260766 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:06:33.260739 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log"
Apr 24 22:06:33.261218 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:06:33.261196 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log"
Apr 24 22:06:33.268661 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:06:33.268636 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log"
Apr 24 22:08:46.262541 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:08:46.262456 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"]
Apr 24 22:08:46.263046 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:08:46.262694 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" podUID="dd12d41f-dd99-4536-b3e7-da6f0745ca15" containerName="splitter-graph-600f1" containerID="cri-o://82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b" gracePeriod=30
Apr 24 22:08:48.355640 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:08:48.355587 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" podUID="dd12d41f-dd99-4536-b3e7-da6f0745ca15" containerName="splitter-graph-600f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:08:53.355909 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:08:53.355868 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" podUID="dd12d41f-dd99-4536-b3e7-da6f0745ca15" containerName="splitter-graph-600f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:08:58.356151 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:08:58.356106 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" podUID="dd12d41f-dd99-4536-b3e7-da6f0745ca15" containerName="splitter-graph-600f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:08:58.356573 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:08:58.356223 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"
Apr 24 22:09:03.356486 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:03.356446 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" podUID="dd12d41f-dd99-4536-b3e7-da6f0745ca15" containerName="splitter-graph-600f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:09:08.355391 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:08.355347 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" podUID="dd12d41f-dd99-4536-b3e7-da6f0745ca15" containerName="splitter-graph-600f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:09:13.356068 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:13.356032 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" podUID="dd12d41f-dd99-4536-b3e7-da6f0745ca15" containerName="splitter-graph-600f1" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:09:16.413925 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.413899 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"
Apr 24 22:09:16.537740 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.537647 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd12d41f-dd99-4536-b3e7-da6f0745ca15-proxy-tls\") pod \"dd12d41f-dd99-4536-b3e7-da6f0745ca15\" (UID: \"dd12d41f-dd99-4536-b3e7-da6f0745ca15\") "
Apr 24 22:09:16.537740 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.537701 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd12d41f-dd99-4536-b3e7-da6f0745ca15-openshift-service-ca-bundle\") pod \"dd12d41f-dd99-4536-b3e7-da6f0745ca15\" (UID: \"dd12d41f-dd99-4536-b3e7-da6f0745ca15\") "
Apr 24 22:09:16.538081 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.538055 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd12d41f-dd99-4536-b3e7-da6f0745ca15-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "dd12d41f-dd99-4536-b3e7-da6f0745ca15" (UID: "dd12d41f-dd99-4536-b3e7-da6f0745ca15"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:09:16.539811 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.539783 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd12d41f-dd99-4536-b3e7-da6f0745ca15-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dd12d41f-dd99-4536-b3e7-da6f0745ca15" (UID: "dd12d41f-dd99-4536-b3e7-da6f0745ca15"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:09:16.638694 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.638649 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd12d41f-dd99-4536-b3e7-da6f0745ca15-openshift-service-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 22:09:16.638694 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.638688 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd12d41f-dd99-4536-b3e7-da6f0745ca15-proxy-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 22:09:16.932087 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.932003 2574 generic.go:358] "Generic (PLEG): container finished" podID="dd12d41f-dd99-4536-b3e7-da6f0745ca15" containerID="82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b" exitCode=0
Apr 24 22:09:16.932087 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.932072 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"
Apr 24 22:09:16.932270 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.932091 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" event={"ID":"dd12d41f-dd99-4536-b3e7-da6f0745ca15","Type":"ContainerDied","Data":"82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b"}
Apr 24 22:09:16.932270 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.932132 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln" event={"ID":"dd12d41f-dd99-4536-b3e7-da6f0745ca15","Type":"ContainerDied","Data":"b54285810a09776994b6c95eb606aec1d44c4f29ea6bc3d854a004834d5aa5a5"}
Apr 24 22:09:16.932270 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.932154 2574 scope.go:117] "RemoveContainer" containerID="82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b"
Apr 24 22:09:16.940815 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.940796 2574 scope.go:117] "RemoveContainer" containerID="82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b"
Apr 24 22:09:16.941091 ip-10-0-133-48 kubenswrapper[2574]: E0424 22:09:16.941070 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b\": container with ID starting with 82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b not found: ID does not exist" containerID="82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b"
Apr 24 22:09:16.941139 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.941102 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b"} err="failed to get container status \"82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b\": rpc error: code = NotFound desc = could not find container \"82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b\": container with ID starting with 82740a7669ff2472dac9ce2a35d7fff681e25ac4eceeaaaf284fac91896cef4b not found: ID does not exist"
Apr 24 22:09:16.952812 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.952778 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"]
Apr 24 22:09:16.955797 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:16.955776 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-600f1-6878b7dd5c-cttln"]
Apr 24 22:09:17.147250 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:09:17.147207 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd12d41f-dd99-4536-b3e7-da6f0745ca15" path="/var/lib/kubelet/pods/dd12d41f-dd99-4536-b3e7-da6f0745ca15/volumes"
Apr 24 22:11:33.284135 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:11:33.284027 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log"
Apr 24 22:11:33.288064 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:11:33.287659 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log"
Apr 24 22:11:33.290446 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:11:33.290425 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log"
Apr 24 22:11:33.293480 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:11:33.293459 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log"
Apr 24 22:16:10.548224 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:10.548190 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk"]
Apr 24 22:16:10.548724 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:10.548448 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" podUID="7b9a1e05-3423-4c72-aa72-f8c7e0e53360" containerName="switch-graph-f4deb" containerID="cri-o://9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e" gracePeriod=30
Apr 24 22:16:11.323035 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.322986 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g6c8f/must-gather-s5hfd"]
Apr 24 22:16:11.323344 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.323317 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd12d41f-dd99-4536-b3e7-da6f0745ca15" containerName="splitter-graph-600f1"
Apr 24 22:16:11.323395 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.323345 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd12d41f-dd99-4536-b3e7-da6f0745ca15" containerName="splitter-graph-600f1"
Apr 24 22:16:11.323434 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.323397 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd12d41f-dd99-4536-b3e7-da6f0745ca15" containerName="splitter-graph-600f1"
Apr 24 22:16:11.326450 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.326426 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g6c8f/must-gather-s5hfd"
Apr 24 22:16:11.329087 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.329070 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g6c8f\"/\"kube-root-ca.crt\""
Apr 24 22:16:11.329204 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.329070 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g6c8f\"/\"default-dockercfg-6g24x\""
Apr 24 22:16:11.329204 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.329066 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g6c8f\"/\"openshift-service-ca.crt\""
Apr 24 22:16:11.350632 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.350600 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g6c8f/must-gather-s5hfd"]
Apr 24 22:16:11.429067 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.429025 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/22e3a86f-4101-492c-b63d-44b0b9e6ed74-must-gather-output\") pod \"must-gather-s5hfd\" (UID: \"22e3a86f-4101-492c-b63d-44b0b9e6ed74\") " pod="openshift-must-gather-g6c8f/must-gather-s5hfd"
Apr 24 22:16:11.429252 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.429153 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kl97\" (UniqueName: \"kubernetes.io/projected/22e3a86f-4101-492c-b63d-44b0b9e6ed74-kube-api-access-5kl97\") pod \"must-gather-s5hfd\" (UID: \"22e3a86f-4101-492c-b63d-44b0b9e6ed74\") " pod="openshift-must-gather-g6c8f/must-gather-s5hfd"
Apr 24 22:16:11.530243 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.530208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kl97\" (UniqueName: \"kubernetes.io/projected/22e3a86f-4101-492c-b63d-44b0b9e6ed74-kube-api-access-5kl97\") pod \"must-gather-s5hfd\" (UID: \"22e3a86f-4101-492c-b63d-44b0b9e6ed74\") " pod="openshift-must-gather-g6c8f/must-gather-s5hfd"
Apr 24 22:16:11.530365 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.530250 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/22e3a86f-4101-492c-b63d-44b0b9e6ed74-must-gather-output\") pod \"must-gather-s5hfd\" (UID: \"22e3a86f-4101-492c-b63d-44b0b9e6ed74\") " pod="openshift-must-gather-g6c8f/must-gather-s5hfd"
Apr 24 22:16:11.530591 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.530574 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/22e3a86f-4101-492c-b63d-44b0b9e6ed74-must-gather-output\") pod \"must-gather-s5hfd\" (UID: \"22e3a86f-4101-492c-b63d-44b0b9e6ed74\") " pod="openshift-must-gather-g6c8f/must-gather-s5hfd"
Apr 24 22:16:11.538919 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.538883 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kl97\" (UniqueName: \"kubernetes.io/projected/22e3a86f-4101-492c-b63d-44b0b9e6ed74-kube-api-access-5kl97\") pod \"must-gather-s5hfd\" (UID: \"22e3a86f-4101-492c-b63d-44b0b9e6ed74\") " pod="openshift-must-gather-g6c8f/must-gather-s5hfd"
Apr 24 22:16:11.636287 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.636184 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g6c8f/must-gather-s5hfd"
Apr 24 22:16:11.753975 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.753881 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g6c8f/must-gather-s5hfd"]
Apr 24 22:16:11.756696 ip-10-0-133-48 kubenswrapper[2574]: W0424 22:16:11.756667 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22e3a86f_4101_492c_b63d_44b0b9e6ed74.slice/crio-0381428d523c5e76f601c0a9bfe246b1e395f9abdfd5bc729c7f37b38ad3ca58 WatchSource:0}: Error finding container 0381428d523c5e76f601c0a9bfe246b1e395f9abdfd5bc729c7f37b38ad3ca58: Status 404 returned error can't find the container with id 0381428d523c5e76f601c0a9bfe246b1e395f9abdfd5bc729c7f37b38ad3ca58
Apr 24 22:16:11.758404 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:11.758382 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:16:12.213415 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:12.213378 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g6c8f/must-gather-s5hfd" event={"ID":"22e3a86f-4101-492c-b63d-44b0b9e6ed74","Type":"ContainerStarted","Data":"0381428d523c5e76f601c0a9bfe246b1e395f9abdfd5bc729c7f37b38ad3ca58"}
Apr 24 22:16:13.227779 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:13.227687 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" podUID="7b9a1e05-3423-4c72-aa72-f8c7e0e53360" containerName="switch-graph-f4deb" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:16:16.228975 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:16.228940 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g6c8f/must-gather-s5hfd" event={"ID":"22e3a86f-4101-492c-b63d-44b0b9e6ed74","Type":"ContainerStarted","Data":"87e5a7f959bd8c124731230ba70b46fe3ab241de2922452b8506fc1857742bef"}
Apr 24 22:16:16.229466 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:16.228984 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g6c8f/must-gather-s5hfd" event={"ID":"22e3a86f-4101-492c-b63d-44b0b9e6ed74","Type":"ContainerStarted","Data":"94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1"}
Apr 24 22:16:16.247139 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:16.247075 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g6c8f/must-gather-s5hfd" podStartSLOduration=1.32078505 podStartE2EDuration="5.24705667s" podCreationTimestamp="2026-04-24 22:16:11 +0000 UTC" firstStartedPulling="2026-04-24 22:16:11.758568927 +0000 UTC m=+2979.105475254" lastFinishedPulling="2026-04-24 22:16:15.68484055 +0000 UTC m=+2983.031746874" observedRunningTime="2026-04-24 22:16:16.245654773 +0000 UTC m=+2983.592561118" watchObservedRunningTime="2026-04-24 22:16:16.24705667 +0000 UTC m=+2983.593963015"
Apr 24 22:16:18.226298 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:18.226261 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" podUID="7b9a1e05-3423-4c72-aa72-f8c7e0e53360" containerName="switch-graph-f4deb" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:16:23.227046 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:23.227006 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" podUID="7b9a1e05-3423-4c72-aa72-f8c7e0e53360" containerName="switch-graph-f4deb" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:16:23.227468 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:23.227178 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk"
Apr 24 22:16:24.735285 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:24.735253 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-f4deb-6fd5579b57-5tldk_7b9a1e05-3423-4c72-aa72-f8c7e0e53360/switch-graph-f4deb/0.log"
Apr 24 22:16:25.598010 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:25.597974 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-f4deb-6fd5579b57-5tldk_7b9a1e05-3423-4c72-aa72-f8c7e0e53360/switch-graph-f4deb/0.log"
Apr 24 22:16:26.467494 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:26.467467 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-f4deb-6fd5579b57-5tldk_7b9a1e05-3423-4c72-aa72-f8c7e0e53360/switch-graph-f4deb/0.log"
Apr 24 22:16:27.261738 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:27.261704 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-f4deb-6fd5579b57-5tldk_7b9a1e05-3423-4c72-aa72-f8c7e0e53360/switch-graph-f4deb/0.log"
Apr 24 22:16:28.045236 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:28.045210 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-f4deb-6fd5579b57-5tldk_7b9a1e05-3423-4c72-aa72-f8c7e0e53360/switch-graph-f4deb/0.log"
Apr 24 22:16:28.227484 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:28.227442 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" podUID="7b9a1e05-3423-4c72-aa72-f8c7e0e53360" containerName="switch-graph-f4deb" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:16:28.820696 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:28.820658 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-f4deb-6fd5579b57-5tldk_7b9a1e05-3423-4c72-aa72-f8c7e0e53360/switch-graph-f4deb/0.log"
Apr 24 22:16:29.625835 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:29.625804 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-f4deb-6fd5579b57-5tldk_7b9a1e05-3423-4c72-aa72-f8c7e0e53360/switch-graph-f4deb/0.log"
Apr 24 22:16:30.423089 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:30.423052 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-f4deb-6fd5579b57-5tldk_7b9a1e05-3423-4c72-aa72-f8c7e0e53360/switch-graph-f4deb/0.log"
Apr 24 22:16:31.230964 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:31.230935 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-f4deb-6fd5579b57-5tldk_7b9a1e05-3423-4c72-aa72-f8c7e0e53360/switch-graph-f4deb/0.log"
Apr 24 22:16:32.036121 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:32.036082 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-f4deb-6fd5579b57-5tldk_7b9a1e05-3423-4c72-aa72-f8c7e0e53360/switch-graph-f4deb/0.log"
Apr 24 22:16:32.862216 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:32.862181 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-f4deb-6fd5579b57-5tldk_7b9a1e05-3423-4c72-aa72-f8c7e0e53360/switch-graph-f4deb/0.log"
Apr 24 22:16:33.225986 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:33.225948 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" podUID="7b9a1e05-3423-4c72-aa72-f8c7e0e53360" containerName="switch-graph-f4deb" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:16:33.313073 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:33.312949 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log"
Apr 24 22:16:33.655237 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:33.315642 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log"
Apr 24 22:16:33.655237 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:33.322492 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log"
Apr 24 22:16:33.655237 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:33.325064 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log"
Apr 24 22:16:33.716737 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:33.716706 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-f4deb-6fd5579b57-5tldk_7b9a1e05-3423-4c72-aa72-f8c7e0e53360/switch-graph-f4deb/0.log"
Apr 24 22:16:35.289587 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:35.289554 2574 generic.go:358] "Generic (PLEG): container finished" podID="22e3a86f-4101-492c-b63d-44b0b9e6ed74" containerID="94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1" exitCode=0
Apr 24 22:16:35.290113 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:35.289607 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g6c8f/must-gather-s5hfd" event={"ID":"22e3a86f-4101-492c-b63d-44b0b9e6ed74","Type":"ContainerDied","Data":"94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1"}
Apr 24 22:16:35.290113 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:35.290014 2574 scope.go:117] "RemoveContainer" containerID="94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1"
Apr 24 22:16:35.418746 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:35.418720 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g6c8f_must-gather-s5hfd_22e3a86f-4101-492c-b63d-44b0b9e6ed74/gather/0.log"
Apr 24 22:16:38.226297 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:38.226255 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" podUID="7b9a1e05-3423-4c72-aa72-f8c7e0e53360" containerName="switch-graph-f4deb" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 22:16:38.685188 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:38.685161 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-w7g56_de5ae8c4-d942-404b-a27a-2ba51dd2184a/global-pull-secret-syncer/0.log"
Apr 24 22:16:38.761899 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:38.761860 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rdccs_0591fbe5-106c-49b3-9a9e-61b9d9370f91/konnectivity-agent/0.log"
Apr 24 22:16:38.863152 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:38.863122 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-48.ec2.internal_be2bfd82c265058eef2da37b1062af3f/haproxy/0.log"
Apr 24 22:16:40.679234 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:40.679211 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk"
Apr 24 22:16:40.794572 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:40.794536 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b9a1e05-3423-4c72-aa72-f8c7e0e53360-proxy-tls\") pod \"7b9a1e05-3423-4c72-aa72-f8c7e0e53360\" (UID: \"7b9a1e05-3423-4c72-aa72-f8c7e0e53360\") "
Apr 24 22:16:40.794755 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:40.794606 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b9a1e05-3423-4c72-aa72-f8c7e0e53360-openshift-service-ca-bundle\") pod \"7b9a1e05-3423-4c72-aa72-f8c7e0e53360\" (UID: \"7b9a1e05-3423-4c72-aa72-f8c7e0e53360\") "
Apr 24 22:16:40.795010 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:40.794983 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b9a1e05-3423-4c72-aa72-f8c7e0e53360-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7b9a1e05-3423-4c72-aa72-f8c7e0e53360" (UID: "7b9a1e05-3423-4c72-aa72-f8c7e0e53360"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:16:40.796790 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:40.796741 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b9a1e05-3423-4c72-aa72-f8c7e0e53360-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7b9a1e05-3423-4c72-aa72-f8c7e0e53360" (UID: "7b9a1e05-3423-4c72-aa72-f8c7e0e53360"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:16:40.856300 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:40.856206 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-g6c8f/must-gather-s5hfd"]
Apr 24 22:16:40.856525 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:40.856500 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-g6c8f/must-gather-s5hfd" podUID="22e3a86f-4101-492c-b63d-44b0b9e6ed74" containerName="copy" containerID="cri-o://87e5a7f959bd8c124731230ba70b46fe3ab241de2922452b8506fc1857742bef" gracePeriod=2
Apr 24 22:16:40.862353 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:40.862308 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-g6c8f/must-gather-s5hfd"]
Apr 24 22:16:40.896038 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:40.896009 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b9a1e05-3423-4c72-aa72-f8c7e0e53360-proxy-tls\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 22:16:40.896038 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:40.896036 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b9a1e05-3423-4c72-aa72-f8c7e0e53360-openshift-service-ca-bundle\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 22:16:41.069410 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.069386 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g6c8f_must-gather-s5hfd_22e3a86f-4101-492c-b63d-44b0b9e6ed74/copy/0.log"
Apr 24 22:16:41.069720 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.069706 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g6c8f/must-gather-s5hfd"
Apr 24 22:16:41.071764 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.071738 2574 status_manager.go:895] "Failed to get status for pod" podUID="22e3a86f-4101-492c-b63d-44b0b9e6ed74" pod="openshift-must-gather-g6c8f/must-gather-s5hfd" err="pods \"must-gather-s5hfd\" is forbidden: User \"system:node:ip-10-0-133-48.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-g6c8f\": no relationship found between node 'ip-10-0-133-48.ec2.internal' and this object"
Apr 24 22:16:41.198603 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.198559 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/22e3a86f-4101-492c-b63d-44b0b9e6ed74-must-gather-output\") pod \"22e3a86f-4101-492c-b63d-44b0b9e6ed74\" (UID: \"22e3a86f-4101-492c-b63d-44b0b9e6ed74\") "
Apr 24 22:16:41.198801 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.198620 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kl97\" (UniqueName: \"kubernetes.io/projected/22e3a86f-4101-492c-b63d-44b0b9e6ed74-kube-api-access-5kl97\") pod \"22e3a86f-4101-492c-b63d-44b0b9e6ed74\" (UID: \"22e3a86f-4101-492c-b63d-44b0b9e6ed74\") "
Apr 24 22:16:41.200129 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.200097 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22e3a86f-4101-492c-b63d-44b0b9e6ed74-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "22e3a86f-4101-492c-b63d-44b0b9e6ed74" (UID: "22e3a86f-4101-492c-b63d-44b0b9e6ed74"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:16:41.200944 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.200925 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22e3a86f-4101-492c-b63d-44b0b9e6ed74-kube-api-access-5kl97" (OuterVolumeSpecName: "kube-api-access-5kl97") pod "22e3a86f-4101-492c-b63d-44b0b9e6ed74" (UID: "22e3a86f-4101-492c-b63d-44b0b9e6ed74"). InnerVolumeSpecName "kube-api-access-5kl97". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:16:41.299643 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.299602 2574 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/22e3a86f-4101-492c-b63d-44b0b9e6ed74-must-gather-output\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 22:16:41.299643 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.299634 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5kl97\" (UniqueName: \"kubernetes.io/projected/22e3a86f-4101-492c-b63d-44b0b9e6ed74-kube-api-access-5kl97\") on node \"ip-10-0-133-48.ec2.internal\" DevicePath \"\""
Apr 24 22:16:41.307809 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.307774 2574 generic.go:358] "Generic (PLEG): container finished" podID="7b9a1e05-3423-4c72-aa72-f8c7e0e53360" containerID="9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e" exitCode=0
Apr 24 22:16:41.307959 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.307959 2574 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" Apr 24 22:16:41.307959 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.307854 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" event={"ID":"7b9a1e05-3423-4c72-aa72-f8c7e0e53360","Type":"ContainerDied","Data":"9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e"} Apr 24 22:16:41.307959 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.307892 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk" event={"ID":"7b9a1e05-3423-4c72-aa72-f8c7e0e53360","Type":"ContainerDied","Data":"2f60d36adeb1bbfaa1d0b8d8da31fc50de8fd2705d56d1ac3bc47348e5cf3401"} Apr 24 22:16:41.307959 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.307923 2574 scope.go:117] "RemoveContainer" containerID="9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e" Apr 24 22:16:41.309199 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.309131 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-g6c8f_must-gather-s5hfd_22e3a86f-4101-492c-b63d-44b0b9e6ed74/copy/0.log" Apr 24 22:16:41.309532 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.309510 2574 generic.go:358] "Generic (PLEG): container finished" podID="22e3a86f-4101-492c-b63d-44b0b9e6ed74" containerID="87e5a7f959bd8c124731230ba70b46fe3ab241de2922452b8506fc1857742bef" exitCode=143 Apr 24 22:16:41.309635 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.309574 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-g6c8f/must-gather-s5hfd" Apr 24 22:16:41.316577 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.316558 2574 scope.go:117] "RemoveContainer" containerID="9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e" Apr 24 22:16:41.316893 ip-10-0-133-48 kubenswrapper[2574]: E0424 22:16:41.316869 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e\": container with ID starting with 9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e not found: ID does not exist" containerID="9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e" Apr 24 22:16:41.317086 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.316906 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e"} err="failed to get container status \"9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e\": rpc error: code = NotFound desc = could not find container \"9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e\": container with ID starting with 9bff2e0277fadac9aca9e089cce9b4259a6bf27d0e1d55aafd9c01c5eab8826e not found: ID does not exist" Apr 24 22:16:41.317086 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.316929 2574 scope.go:117] "RemoveContainer" containerID="87e5a7f959bd8c124731230ba70b46fe3ab241de2922452b8506fc1857742bef" Apr 24 22:16:41.323989 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.323966 2574 scope.go:117] "RemoveContainer" containerID="94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1" Apr 24 22:16:41.324044 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.323973 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk"] Apr 24 22:16:41.328618 ip-10-0-133-48 
kubenswrapper[2574]: I0424 22:16:41.328596 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-f4deb-6fd5579b57-5tldk"] Apr 24 22:16:41.335290 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.335271 2574 scope.go:117] "RemoveContainer" containerID="87e5a7f959bd8c124731230ba70b46fe3ab241de2922452b8506fc1857742bef" Apr 24 22:16:41.335673 ip-10-0-133-48 kubenswrapper[2574]: E0424 22:16:41.335652 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87e5a7f959bd8c124731230ba70b46fe3ab241de2922452b8506fc1857742bef\": container with ID starting with 87e5a7f959bd8c124731230ba70b46fe3ab241de2922452b8506fc1857742bef not found: ID does not exist" containerID="87e5a7f959bd8c124731230ba70b46fe3ab241de2922452b8506fc1857742bef" Apr 24 22:16:41.335758 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.335685 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87e5a7f959bd8c124731230ba70b46fe3ab241de2922452b8506fc1857742bef"} err="failed to get container status \"87e5a7f959bd8c124731230ba70b46fe3ab241de2922452b8506fc1857742bef\": rpc error: code = NotFound desc = could not find container \"87e5a7f959bd8c124731230ba70b46fe3ab241de2922452b8506fc1857742bef\": container with ID starting with 87e5a7f959bd8c124731230ba70b46fe3ab241de2922452b8506fc1857742bef not found: ID does not exist" Apr 24 22:16:41.335758 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.335712 2574 scope.go:117] "RemoveContainer" containerID="94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1" Apr 24 22:16:41.335985 ip-10-0-133-48 kubenswrapper[2574]: E0424 22:16:41.335968 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1\": container with ID starting with 
94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1 not found: ID does not exist" containerID="94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1" Apr 24 22:16:41.336042 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:41.335991 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1"} err="failed to get container status \"94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1\": rpc error: code = NotFound desc = could not find container \"94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1\": container with ID starting with 94d54d19da2da00802b4b435d4266906f000d336522b281859420215cac02ae1 not found: ID does not exist" Apr 24 22:16:42.122751 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.122723 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cf4ps_079d5544-a12a-4f44-b625-ddbc27905004/kube-state-metrics/0.log" Apr 24 22:16:42.149658 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.149629 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cf4ps_079d5544-a12a-4f44-b625-ddbc27905004/kube-rbac-proxy-main/0.log" Apr 24 22:16:42.171819 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.171786 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cf4ps_079d5544-a12a-4f44-b625-ddbc27905004/kube-rbac-proxy-self/0.log" Apr 24 22:16:42.224947 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.224918 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-dc4hl_8b155916-68a3-40b1-8d71-903f176840f4/monitoring-plugin/0.log" Apr 24 22:16:42.254203 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.254175 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-7l4np_e5056461-495e-4986-b3fb-3519148ed518/node-exporter/0.log" Apr 24 22:16:42.276042 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.276006 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7l4np_e5056461-495e-4986-b3fb-3519148ed518/kube-rbac-proxy/0.log" Apr 24 22:16:42.297820 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.297793 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7l4np_e5056461-495e-4986-b3fb-3519148ed518/init-textfile/0.log" Apr 24 22:16:42.579361 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.579317 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7/prometheus/0.log" Apr 24 22:16:42.596145 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.596121 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7/config-reloader/0.log" Apr 24 22:16:42.619156 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.619120 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7/thanos-sidecar/0.log" Apr 24 22:16:42.640974 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.640946 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7/kube-rbac-proxy-web/0.log" Apr 24 22:16:42.664479 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.664452 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7/kube-rbac-proxy/0.log" Apr 24 22:16:42.688403 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.688375 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7/kube-rbac-proxy-thanos/0.log" Apr 24 22:16:42.709816 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.709793 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_69c2a0ff-61af-4bfd-9c4b-89e876dcc4c7/init-config-reloader/0.log" Apr 24 22:16:42.744876 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.744845 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jx79l_d53ae6be-1641-41bb-8724-bcfe224ed319/prometheus-operator/0.log" Apr 24 22:16:42.763375 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.763348 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jx79l_d53ae6be-1641-41bb-8724-bcfe224ed319/kube-rbac-proxy/0.log" Apr 24 22:16:42.894246 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.894168 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6b5dc768d8-mcjn6_914ee558-60df-40ee-a269-3bed78eff9a0/thanos-query/0.log" Apr 24 22:16:42.916409 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.916381 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6b5dc768d8-mcjn6_914ee558-60df-40ee-a269-3bed78eff9a0/kube-rbac-proxy-web/0.log" Apr 24 22:16:42.939813 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.939788 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6b5dc768d8-mcjn6_914ee558-60df-40ee-a269-3bed78eff9a0/kube-rbac-proxy/0.log" Apr 24 22:16:42.963720 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:42.963695 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6b5dc768d8-mcjn6_914ee558-60df-40ee-a269-3bed78eff9a0/prom-label-proxy/0.log" Apr 24 22:16:42.987809 ip-10-0-133-48 kubenswrapper[2574]: 
I0424 22:16:42.987783 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6b5dc768d8-mcjn6_914ee558-60df-40ee-a269-3bed78eff9a0/kube-rbac-proxy-rules/0.log" Apr 24 22:16:43.012708 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:43.012679 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6b5dc768d8-mcjn6_914ee558-60df-40ee-a269-3bed78eff9a0/kube-rbac-proxy-metrics/0.log" Apr 24 22:16:43.146641 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:43.146564 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22e3a86f-4101-492c-b63d-44b0b9e6ed74" path="/var/lib/kubelet/pods/22e3a86f-4101-492c-b63d-44b0b9e6ed74/volumes" Apr 24 22:16:43.146991 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:43.146928 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9a1e05-3423-4c72-aa72-f8c7e0e53360" path="/var/lib/kubelet/pods/7b9a1e05-3423-4c72-aa72-f8c7e0e53360/volumes" Apr 24 22:16:44.188032 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:44.187998 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-jkddd_1cd5a898-ba76-4c36-ab46-16db7f1b61bd/networking-console-plugin/0.log" Apr 24 22:16:44.645635 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:44.645545 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/2.log" Apr 24 22:16:44.653123 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:44.653094 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jpjmc_219a5443-bbde-4ab4-bb73-46a6160644d2/console-operator/3.log" Apr 24 22:16:45.764286 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.764249 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9"] Apr 24 22:16:45.764702 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.764581 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b9a1e05-3423-4c72-aa72-f8c7e0e53360" containerName="switch-graph-f4deb" Apr 24 22:16:45.764702 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.764592 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9a1e05-3423-4c72-aa72-f8c7e0e53360" containerName="switch-graph-f4deb" Apr 24 22:16:45.764702 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.764601 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22e3a86f-4101-492c-b63d-44b0b9e6ed74" containerName="gather" Apr 24 22:16:45.764702 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.764607 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e3a86f-4101-492c-b63d-44b0b9e6ed74" containerName="gather" Apr 24 22:16:45.764702 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.764615 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22e3a86f-4101-492c-b63d-44b0b9e6ed74" containerName="copy" Apr 24 22:16:45.764702 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.764620 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e3a86f-4101-492c-b63d-44b0b9e6ed74" containerName="copy" Apr 24 22:16:45.764702 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.764678 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="22e3a86f-4101-492c-b63d-44b0b9e6ed74" containerName="gather" Apr 24 22:16:45.764702 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.764689 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="22e3a86f-4101-492c-b63d-44b0b9e6ed74" containerName="copy" Apr 24 22:16:45.764702 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.764695 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b9a1e05-3423-4c72-aa72-f8c7e0e53360" 
containerName="switch-graph-f4deb" Apr 24 22:16:45.769825 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.769798 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.772375 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.772354 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7csvq\"/\"openshift-service-ca.crt\"" Apr 24 22:16:45.772480 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.772410 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7csvq\"/\"default-dockercfg-vv79p\"" Apr 24 22:16:45.772712 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.772690 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7csvq\"/\"kube-root-ca.crt\"" Apr 24 22:16:45.777375 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.777350 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9"] Apr 24 22:16:45.838372 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.838316 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-lib-modules\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.838585 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.838396 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-proc\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " 
pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.838585 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.838468 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j59lk\" (UniqueName: \"kubernetes.io/projected/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-kube-api-access-j59lk\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.838585 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.838525 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-podres\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.838585 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.838558 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-sys\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.939674 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.939636 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j59lk\" (UniqueName: \"kubernetes.io/projected/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-kube-api-access-j59lk\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.939842 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.939687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-podres\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.939842 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.939821 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-podres\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.939981 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.939823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-sys\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.939981 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.939881 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-sys\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.939981 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.939886 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-lib-modules\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.939981 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.939967 
2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-lib-modules\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.940152 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.939992 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-proc\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.940152 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.940081 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-proc\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:45.947543 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:45.947505 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j59lk\" (UniqueName: \"kubernetes.io/projected/b4f79a7a-84bb-4d29-9120-d5df30cfe5f2-kube-api-access-j59lk\") pod \"perf-node-gather-daemonset-q5lm9\" (UID: \"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2\") " pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:46.079930 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:46.079841 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" Apr 24 22:16:46.187932 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:46.187903 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-drrd5_ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399/dns/0.log" Apr 24 22:16:46.200111 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:46.200081 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9"] Apr 24 22:16:46.204178 ip-10-0-133-48 kubenswrapper[2574]: W0424 22:16:46.204150 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb4f79a7a_84bb_4d29_9120_d5df30cfe5f2.slice/crio-e35e99f31b5514c758efe92f7ef1ec8353ff639af685e1349c64e92e3e9c25e6 WatchSource:0}: Error finding container e35e99f31b5514c758efe92f7ef1ec8353ff639af685e1349c64e92e3e9c25e6: Status 404 returned error can't find the container with id e35e99f31b5514c758efe92f7ef1ec8353ff639af685e1349c64e92e3e9c25e6 Apr 24 22:16:46.215348 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:46.215232 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-drrd5_ff1a9672-5ebe-4d07-ac1a-56fbbdfdc399/kube-rbac-proxy/0.log" Apr 24 22:16:46.327059 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:46.327025 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" event={"ID":"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2","Type":"ContainerStarted","Data":"6aa7e4cb0d8ce1482fd423e4b6e3c129a9f22432ea647fce495eaf962d37cbae"} Apr 24 22:16:46.327059 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:46.327060 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" event={"ID":"b4f79a7a-84bb-4d29-9120-d5df30cfe5f2","Type":"ContainerStarted","Data":"e35e99f31b5514c758efe92f7ef1ec8353ff639af685e1349c64e92e3e9c25e6"} Apr 24 
22:16:46.327258 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:46.327086 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9"
Apr 24 22:16:46.344970 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:46.344879 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9" podStartSLOduration=1.344762696 podStartE2EDuration="1.344762696s" podCreationTimestamp="2026-04-24 22:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:16:46.343942518 +0000 UTC m=+3013.690848863" watchObservedRunningTime="2026-04-24 22:16:46.344762696 +0000 UTC m=+3013.691669041"
Apr 24 22:16:46.363471 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:46.363442 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-g2zr8_eab474e7-7b20-4ca7-aedb-49915bb5ec3e/dns-node-resolver/0.log"
Apr 24 22:16:46.811846 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:46.811815 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pmknh_5c88e2e4-e223-40b9-b7e7-7cc6d01a43d7/node-ca/0.log"
Apr 24 22:16:47.967491 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:47.967460 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zq6rm_645772fe-eb62-4443-a6dd-6a10b3593053/serve-healthcheck-canary/0.log"
Apr 24 22:16:48.334468 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:48.334385 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mvrpd_c3ca6061-7a72-47f7-9755-4619b0e3b74e/kube-rbac-proxy/0.log"
Apr 24 22:16:48.355445 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:48.355404 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mvrpd_c3ca6061-7a72-47f7-9755-4619b0e3b74e/exporter/0.log"
Apr 24 22:16:48.376984 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:48.376950 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mvrpd_c3ca6061-7a72-47f7-9755-4619b0e3b74e/extractor/0.log"
Apr 24 22:16:50.797144 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:50.797112 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-nk48p_86417de4-f634-4847-b71e-359bc85539ec/s3-init/0.log"
Apr 24 22:16:52.340314 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:52.340289 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7csvq/perf-node-gather-daemonset-q5lm9"
Apr 24 22:16:54.638431 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:54.638394 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-mkr9h_e6cceb68-1e95-4268-ba54-7b7980ce8560/migrator/0.log"
Apr 24 22:16:54.665708 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:54.665678 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-mkr9h_e6cceb68-1e95-4268-ba54-7b7980ce8560/graceful-termination/0.log"
Apr 24 22:16:56.197508 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:56.197478 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dxnlv_f9c892b6-3ea7-435d-a215-90c3211e772b/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:16:56.236419 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:56.236391 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dxnlv_f9c892b6-3ea7-435d-a215-90c3211e772b/egress-router-binary-copy/0.log"
Apr 24 22:16:56.283455 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:56.283428 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dxnlv_f9c892b6-3ea7-435d-a215-90c3211e772b/cni-plugins/0.log"
Apr 24 22:16:56.335429 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:56.335403 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dxnlv_f9c892b6-3ea7-435d-a215-90c3211e772b/bond-cni-plugin/0.log"
Apr 24 22:16:56.380412 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:56.380380 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dxnlv_f9c892b6-3ea7-435d-a215-90c3211e772b/routeoverride-cni/0.log"
Apr 24 22:16:56.416404 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:56.416377 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dxnlv_f9c892b6-3ea7-435d-a215-90c3211e772b/whereabouts-cni-bincopy/0.log"
Apr 24 22:16:56.461483 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:56.461389 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dxnlv_f9c892b6-3ea7-435d-a215-90c3211e772b/whereabouts-cni/0.log"
Apr 24 22:16:56.756953 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:56.756869 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f9nvj_c6abad34-23d6-4992-ab13-a2cf5ff8141a/kube-multus/0.log"
Apr 24 22:16:56.878440 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:56.878408 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jtqkc_e2af22c1-baca-4054-87ff-daf77606438a/network-metrics-daemon/0.log"
Apr 24 22:16:56.920573 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:56.920544 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jtqkc_e2af22c1-baca-4054-87ff-daf77606438a/kube-rbac-proxy/0.log"
Apr 24 22:16:58.342028 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:58.341992 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-controller/0.log"
Apr 24 22:16:58.361231 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:58.361201 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/0.log"
Apr 24 22:16:58.388243 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:58.388212 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovn-acl-logging/1.log"
Apr 24 22:16:58.419930 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:58.419893 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/kube-rbac-proxy-node/0.log"
Apr 24 22:16:58.442209 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:58.442170 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:16:58.462169 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:58.462137 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/northd/0.log"
Apr 24 22:16:58.485204 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:58.485175 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/nbdb/0.log"
Apr 24 22:16:58.507955 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:58.507927 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/sbdb/0.log"
Apr 24 22:16:58.678280 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:58.678183 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf75n_1a62288c-a1f0-46d2-b77f-e15d23159b1a/ovnkube-controller/0.log"
Apr 24 22:16:59.702122 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:59.702093 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-7jkld_e8dd8df3-5e85-4602-8b9c-38eae175e30e/check-endpoints/0.log"
Apr 24 22:16:59.770487 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:16:59.770460 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-wvpqz_d326ac93-1c28-465f-80fb-a44c5fd5cb0b/network-check-target-container/0.log"
Apr 24 22:17:00.704504 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:17:00.704474 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-4fzdt_998844f7-6ade-465d-8041-762411d1f8e2/iptables-alerter/0.log"
Apr 24 22:17:01.432422 ip-10-0-133-48 kubenswrapper[2574]: I0424 22:17:01.432376 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nwk6j_33bbf579-be90-4be9-aa5b-30ac9af3f6d2/tuned/0.log"