Apr 16 18:02:05.767719 ip-10-0-134-133 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:02:06.219427 ip-10-0-134-133 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:06.219427 ip-10-0-134-133 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:02:06.219427 ip-10-0-134-133 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:06.219427 ip-10-0-134-133 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:02:06.219427 ip-10-0-134-133 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:02:06.220222 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.220133 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:02:06.223195 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223180 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:06.223195 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223195 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:06.223195 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223198 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223202 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223206 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223209 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223212 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223215 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223220 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223226 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223229 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223232 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223235 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223238 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223241 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223243 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223246 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223249 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223261 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223264 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223267 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:06.223289 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223269 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223272 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223275 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223278 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223280 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223283 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223286 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223289 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223292 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223295 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223298 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223301 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223304 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223307 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223310 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223313 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223316 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223318 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223321 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223324 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:06.223759 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223326 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223329 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223331 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223334 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223336 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223339 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223341 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223345 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223347 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223350 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223352 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223355 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223358 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223361 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223364 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223380 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223382 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223385 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223388 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:06.224249 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223391 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223394 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223397 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223399 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223402 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223404 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223407 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223410 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223412 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223415 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223417 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223420 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223423 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223425 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223429 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223432 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223434 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223437 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223439 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223443 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:06.224748 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223445 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223450 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223454 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223456 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223459 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.223462 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225055 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225064 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225068 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225072 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225075 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225078 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225080 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225083 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225086 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225088 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225091 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225093 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225096 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:06.225228 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225098 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225101 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225103 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225106 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225108 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225111 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225113 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225116 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225119 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225122 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225124 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225127 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225131 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225134 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225137 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225141 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225144 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225147 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225149 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:06.225751 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225152 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225154 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225157 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225159 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225162 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225164 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225167 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225170 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225172 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225175 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225178 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225180 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225183 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225186 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225188 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225190 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225193 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225195 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225198 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225202 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:06.226224 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225204 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225207 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225211 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225213 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225216 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225219 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225221 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225224 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225227 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225230 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225232 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225236 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225239 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225242 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225244 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225247 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225250 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225252 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225255 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225258 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:06.226718 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225260 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225263 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225266 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225268 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225271 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225273 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225276 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225278 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225281 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225283 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225286 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225288 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225291 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.225293 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225385 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225395 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225404 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225410 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225416 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225419 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225425 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225430 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:02:06.227210 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225433 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225437 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225440 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225444 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225447 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225450 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225453 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225456 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225459 2577 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225462 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225465 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225470 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225474 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225477 2577 flags.go:64] FLAG: --config-dir=""
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225480 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225484 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225488 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225491 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225494 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225497 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225501 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225504 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225507 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225510 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225514 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:02:06.227767 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225519 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225522 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225525 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225528 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225531 2577 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225534 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225539 2577 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225542 2577 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225545 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225548 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225551 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225555 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225558 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225561 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225564 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225567 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225570 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225573 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225576 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225579 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225582 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225585 2577 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225588 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225592 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225595 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:02:06.228358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225598 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225601 2577 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225604 2577 flags.go:64] FLAG: --help="false"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225607 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-134-133.ec2.internal"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225611 2577 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225614 2577 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225617 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225621 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225625 2577 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225628 2577 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225631 2577 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225634 2577 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225637 2577 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225640 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225644 2577 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225647 2577 flags.go:64] FLAG: --kube-reserved=""
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225650 2577 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225653 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225656 2577 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225659 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225662 2577 flags.go:64] FLAG: --lock-file=""
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225664 2577 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225667 2577 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225670 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 18:02:06.229017 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225676 2577 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225679 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225682 2577 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225685 2577 flags.go:64] FLAG: --logging-format="text"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225688 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225691 2577 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225694 2577 flags.go:64] FLAG: --manifest-url=""
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225697 2577 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225701 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225705 2577 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225709 2577 flags.go:64] FLAG: --max-pods="110"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225712 2577 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225715 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225718 2577 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225721 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225725 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225728 2577 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225732 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225741 2577 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225744 2577 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225748 2577 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225751 2577 flags.go:64] FLAG: --pod-cidr=""
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225754 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 18:02:06.229604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225760 2577 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225763 2577 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225767 2577 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225770 2577 flags.go:64] FLAG: --port="10250"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225773 2577 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225777 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0337b09557b924c76"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225780 2577 flags.go:64] FLAG: --qos-reserved=""
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225783 2577 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225786 2577 flags.go:64] FLAG: --register-node="true"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225789 2577 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225792 2577 flags.go:64] FLAG: --register-with-taints=""
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225796 2577 flags.go:64] FLAG: --registry-burst="10"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225799 2577 flags.go:64] FLAG: --registry-qps="5"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225802 2577 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225804 2577 flags.go:64] FLAG: --reserved-memory=""
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225808 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225811 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225814 2577 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225817 2577 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225821 2577 flags.go:64] FLAG: --runonce="false"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225824 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225827 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225830 2577 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225833 2577 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225840 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225843 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 18:02:06.230180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225846 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225850 2577 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225853 2577 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225857 2577 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225860 2577 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225862 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225866 2577 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225869 2577 flags.go:64] FLAG: --system-cgroups=""
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225871 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225877 2577 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225880 2577 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225883 2577 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225887 2577 flags.go:64] FLAG: --tls-min-version=""
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225890 2577 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225893 2577 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225896 2577 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225899 2577 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225902 2577 flags.go:64] FLAG: --v="2"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225906 2577 flags.go:64] FLAG: --version="false"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225910 2577 flags.go:64] FLAG: --vmodule=""
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225915 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.225918 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226043 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226047 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:02:06.230855 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226050 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226054 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226056 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226059 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226062 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226065 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226069 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226072 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226074 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226077 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226080 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226085 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226088 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226090 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226093 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226096 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226099 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226101 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226104 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226106 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:02:06.231440 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226109 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226111 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226115 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226119 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226122 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226125 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226127 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226130 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226132 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226135 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226138 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226140 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226143 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226145 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226148 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226150 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226153 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226156 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226160 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:02:06.231994 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226163 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226165 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226168 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226171 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226175 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226178 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226182 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226185 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226188 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226191 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226194 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226197 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226200 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226204 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226207 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226209 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226212 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226214 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226217 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:02:06.232492 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226220 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226222 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226225 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226228 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226231 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226235 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226237 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226240 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226243 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226245 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226248 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226251 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226255 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226257 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226260 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226263 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226265 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226269 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226272 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226275 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:02:06.232978 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226278 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:02:06.233489 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226280 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:02:06.233489 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226283 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:02:06.233489 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226285 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:02:06.233489 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226288 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:02:06.233489 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.226291 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:02:06.233489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.226951 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:02:06.234440 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.234388 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:02:06.234440 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.234408 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr
16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234460 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234466 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234469 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234472 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234475 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234478 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234481 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234483 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234486 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234488 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234491 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234494 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234497 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234500 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234503 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234505 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234508 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234511 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234515 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:02:06.234509 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234518 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234521 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234524 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234527 
2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234531 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234533 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234536 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234539 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234541 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234544 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234547 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234549 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234559 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234562 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234564 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234567 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234569 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234572 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234575 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234577 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:02:06.235005 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234579 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234582 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234585 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234587 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234590 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234593 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 
16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234595 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234598 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234602 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234605 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234608 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234610 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234613 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234615 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234619 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234622 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234625 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234627 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234630 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:02:06.235503 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234632 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234635 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234637 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234640 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234642 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234645 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234648 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234651 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234653 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:02:06.236012 
ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234656 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234658 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234661 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234663 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234666 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234668 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234671 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234673 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234676 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234678 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234681 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:02:06.236012 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234683 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234686 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234689 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234691 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234694 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234698 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234703 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234706 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.234712 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234812 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234817 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234821 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234824 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234827 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234829 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:02:06.236520 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234832 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234835 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234837 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234840 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234843 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234846 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234848 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234851 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234854 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234858 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234862 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234865 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234868 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234871 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234873 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234876 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234879 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234882 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234884 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:02:06.236898 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234887 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234890 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234892 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234895 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234897 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234900 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234903 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234906 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234908 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234911 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234914 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234917 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234919 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:02:06.237354 ip-10-0-134-133 
kubenswrapper[2577]: W0416 18:02:06.234922 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234925 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234927 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234930 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234932 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234935 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234938 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:02:06.237354 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234941 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234943 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234946 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234949 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234952 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234956 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234959 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234962 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234964 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234968 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234970 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234973 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234975 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234978 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234981 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234983 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234986 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234989 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234991 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234994 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:02:06.237865 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234997 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.234999 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235002 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235004 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235007 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235010 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235013 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235015 2577 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235018 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235021 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235023 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235026 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235029 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235032 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235034 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235037 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235039 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235042 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235044 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235047 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:02:06.238355 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:06.235049 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:02:06.238961 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.235054 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:02:06.238961 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.235829 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 18:02:06.238961 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.237745 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 18:02:06.238961 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.238734 2577 server.go:1019] "Starting client certificate rotation" Apr 16 18:02:06.238961 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.238830 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:02:06.238961 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.238874 2577 certificate_manager.go:566] 
"Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:02:06.263065 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.263037 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:02:06.268891 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.268860 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:02:06.286282 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.286252 2577 log.go:25] "Validated CRI v1 runtime API" Apr 16 18:02:06.289209 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.289182 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:02:06.291429 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.291407 2577 log.go:25] "Validated CRI v1 image API" Apr 16 18:02:06.292704 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.292687 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 18:02:06.298153 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.298131 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7ad3098d-76c4-47bd-97b6-f39d50a1d07c:/dev/nvme0n1p3 d2e10b46-29f8-430d-aacf-5b66bb89f221:/dev/nvme0n1p4] Apr 16 18:02:06.298226 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.298152 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 18:02:06.303877 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.303754 2577 manager.go:217] Machine: {Timestamp:2026-04-16 18:02:06.301964179 +0000 UTC m=+0.414771624 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098425 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2cab10cfae81ce1ef6f2ebe160a995 SystemUUID:ec2cab10-cfae-81ce-1ef6-f2ebe160a995 BootID:ad4ff6ad-b87e-4a18-83c8-fb6ed26a5028 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:44:93:74:46:63 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:44:93:74:46:63 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e6:08:a5:a5:3a:72 
Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 18:02:06.303877 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.303863 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 18:02:06.304003 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.303960 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 18:02:06.305037 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.305008 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 18:02:06.305183 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.305041 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-133.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:02:06.305231 
ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.305192 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:02:06.305231 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.305202 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:02:06.305231 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.305216 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:02:06.306602 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.306590 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:02:06.307799 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.307787 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:02:06.307915 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.307905 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:02:06.310147 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.310135 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:02:06.310189 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.310154 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:02:06.310189 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.310169 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:02:06.310189 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.310178 2577 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:02:06.310189 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.310188 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:02:06.311324 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.311311 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:02:06.311389 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.311331 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:02:06.316671 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.316651 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:02:06.318326 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.318311 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:02:06.319619 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.319601 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:02:06.319664 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.319632 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:02:06.319664 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.319645 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:02:06.319664 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.319658 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:02:06.319747 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.319670 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:02:06.319747 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.319682 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 
18:02:06.319747 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.319694 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:02:06.319747 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.319707 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:02:06.319747 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.319721 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:02:06.319747 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.319733 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:02:06.319911 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.319760 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:02:06.319911 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.319778 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:02:06.320585 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.320573 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:02:06.320623 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.320586 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:02:06.324797 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.324761 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-133.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:02:06.325014 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.324998 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:02:06.325066 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.325042 2577 server.go:1295] "Started kubelet" Apr 16 18:02:06.325459 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.325426 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:02:06.325548 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.325491 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-133.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:02:06.325622 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.325550 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:02:06.325922 ip-10-0-134-133 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 18:02:06.326569 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.326440 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:02:06.326569 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.326533 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:02:06.327212 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.327197 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:02:06.327609 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.327589 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:02:06.334272 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.334240 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:02:06.335440 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.335419 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:02:06.336350 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.336327 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-133.ec2.internal\" not found" Apr 16 18:02:06.336786 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.336761 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:02:06.336786 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.336780 2577 factory.go:55] Registering systemd factory Apr 16 18:02:06.336786 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.336787 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:02:06.336964 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.336839 2577 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:02:06.336964 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.336852 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:02:06.336964 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.336896 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:02:06.336964 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.336907 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:02:06.336964 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.336949 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:02:06.337176 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.337074 2577 factory.go:153] Registering CRI-O factory Apr 16 18:02:06.337176 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.337091 2577 factory.go:223] Registration of the crio container factory successfully Apr 16 18:02:06.337176 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.337166 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:02:06.337341 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.337191 2577 factory.go:103] Registering Raw factory Apr 16 18:02:06.337341 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.337207 2577 manager.go:1196] Started watching for new ooms in manager Apr 16 18:02:06.337954 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.337936 2577 manager.go:319] Starting recovery of all containers Apr 16 18:02:06.339491 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.339459 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:02:06.339674 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.339648 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-133.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:02:06.340559 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.339438 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-133.ec2.internal.18a6e84e5deafef8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-133.ec2.internal,UID:ip-10-0-134-133.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-133.ec2.internal,},FirstTimestamp:2026-04-16 18:02:06.325014264 +0000 UTC m=+0.437821710,LastTimestamp:2026-04-16 18:02:06.325014264 +0000 UTC m=+0.437821710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-133.ec2.internal,}" Apr 16 18:02:06.349071 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.348930 2577 manager.go:324] Recovery completed Apr 16 18:02:06.353585 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.353569 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:06.356087 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.356070 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:06.356139 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.356110 2577 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:06.356139 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.356122 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:06.356673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.356657 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:02:06.356733 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.356673 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:02:06.356733 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.356690 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:02:06.358845 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.358771 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-133.ec2.internal.18a6e84e5fc52389 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-133.ec2.internal,UID:ip-10-0-134-133.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-133.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-133.ec2.internal,},FirstTimestamp:2026-04-16 18:02:06.356087689 +0000 UTC m=+0.468895140,LastTimestamp:2026-04-16 18:02:06.356087689 +0000 UTC m=+0.468895140,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-133.ec2.internal,}" Apr 16 18:02:06.359359 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.359344 2577 policy_none.go:49] "None policy: Start" Apr 16 18:02:06.359420 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.359384 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:02:06.359420 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.359399 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:02:06.368210 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.368124 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-133.ec2.internal.18a6e84e5fc591da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-133.ec2.internal,UID:ip-10-0-134-133.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-133.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-133.ec2.internal,},FirstTimestamp:2026-04-16 18:02:06.35611593 +0000 UTC m=+0.468923375,LastTimestamp:2026-04-16 18:02:06.35611593 +0000 UTC m=+0.468923375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-133.ec2.internal,}" Apr 16 18:02:06.377213 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.377128 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-133.ec2.internal.18a6e84e5fc5ba76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-133.ec2.internal,UID:ip-10-0-134-133.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-134-133.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-134-133.ec2.internal,},FirstTimestamp:2026-04-16 18:02:06.356126326 +0000 UTC m=+0.468933770,LastTimestamp:2026-04-16 18:02:06.356126326 +0000 UTC m=+0.468933770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-133.ec2.internal,}" Apr 16 18:02:06.402994 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.402974 2577 manager.go:341] "Starting Device Plugin manager" Apr 16 18:02:06.408123 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.403010 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:02:06.408123 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.403020 2577 server.go:85] "Starting device plugin registration server" Apr 16 18:02:06.408123 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.403305 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:02:06.408123 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.403318 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:02:06.408123 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.403474 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:02:06.408123 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.403606 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:02:06.408123 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.403628 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:02:06.408123 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.403782 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-q2snd" Apr 16 18:02:06.408123 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.404122 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:02:06.408123 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.404158 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-133.ec2.internal\" not found" Apr 16 18:02:06.409506 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.409488 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-q2snd" Apr 16 18:02:06.470948 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.470863 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:02:06.472223 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.472200 2577 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 18:02:06.472354 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.472236 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:02:06.472354 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.472261 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 18:02:06.472354 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.472271 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:02:06.472354 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.472314 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:02:06.475042 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.475019 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:06.504536 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.504502 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:06.505677 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.505650 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:06.505807 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.505688 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:06.505807 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.505698 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:06.505807 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.505726 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.515285 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.515265 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.515361 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.515292 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-133.ec2.internal\": node \"ip-10-0-134-133.ec2.internal\" not found" Apr 16 18:02:06.531290 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.531262 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-133.ec2.internal\" not found" Apr 16 18:02:06.572558 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.572517 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-133.ec2.internal"] Apr 16 18:02:06.572752 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.572620 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:06.573656 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.573640 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:06.573752 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.573677 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:06.573752 ip-10-0-134-133 kubenswrapper[2577]: 
I0416 18:02:06.573693 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:06.574900 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.574885 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:06.575053 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.575036 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.575097 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.575068 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:06.575612 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.575598 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:06.575678 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.575626 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:06.575678 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.575637 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:06.575678 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.575599 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:06.575769 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.575704 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:06.575769 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.575719 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:06.577243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.577226 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.577318 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.577253 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:02:06.577967 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.577952 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:02:06.578045 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.577982 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:02:06.578045 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.577996 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:02:06.602280 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.602256 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-133.ec2.internal\" not found" node="ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.606809 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.606793 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-133.ec2.internal\" not found" node="ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.631992 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.631964 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-133.ec2.internal\" not found" Apr 16 18:02:06.638326 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.638302 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/550cd3d62a888e29b98d67015acd7a34-config\") pod \"kube-apiserver-proxy-ip-10-0-134-133.ec2.internal\" (UID: \"550cd3d62a888e29b98d67015acd7a34\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.638428 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.638332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8c882a91b17682408e1f8cbddf91ea4a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal\" (UID: \"8c882a91b17682408e1f8cbddf91ea4a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.638428 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.638349 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c882a91b17682408e1f8cbddf91ea4a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal\" (UID: \"8c882a91b17682408e1f8cbddf91ea4a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.732145 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.732044 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-133.ec2.internal\" not found" Apr 16 18:02:06.739439 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.739417 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/550cd3d62a888e29b98d67015acd7a34-config\") pod \"kube-apiserver-proxy-ip-10-0-134-133.ec2.internal\" (UID: \"550cd3d62a888e29b98d67015acd7a34\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.739533 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.739447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8c882a91b17682408e1f8cbddf91ea4a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal\" (UID: \"8c882a91b17682408e1f8cbddf91ea4a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.739533 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.739465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c882a91b17682408e1f8cbddf91ea4a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal\" (UID: \"8c882a91b17682408e1f8cbddf91ea4a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.739533 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.739506 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c882a91b17682408e1f8cbddf91ea4a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal\" (UID: \"8c882a91b17682408e1f8cbddf91ea4a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.739533 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.739524 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/550cd3d62a888e29b98d67015acd7a34-config\") pod \"kube-apiserver-proxy-ip-10-0-134-133.ec2.internal\" (UID: \"550cd3d62a888e29b98d67015acd7a34\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.739674 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.739541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8c882a91b17682408e1f8cbddf91ea4a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal\" (UID: \"8c882a91b17682408e1f8cbddf91ea4a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.832857 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.832815 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-133.ec2.internal\" not found" Apr 16 18:02:06.905360 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.905322 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.908853 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:06.908835 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-133.ec2.internal" Apr 16 18:02:06.933530 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:06.933491 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-133.ec2.internal\" not found" Apr 16 18:02:07.034193 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.034107 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-133.ec2.internal\" not found" Apr 16 18:02:07.134629 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.134594 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-133.ec2.internal\" not found" Apr 16 18:02:07.235226 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.235196 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-133.ec2.internal\" not found" Apr 16 18:02:07.238458 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.238431 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:02:07.238598 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.238582 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:02:07.252215 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.252188 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:07.310985 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.310903 2577 apiserver.go:52] "Watching apiserver" Apr 16 18:02:07.320591 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.320286 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 18:02:07.322284 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.321220 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-dfdlg","openshift-network-operator/iptables-alerter-spc8d","openshift-ovn-kubernetes/ovnkube-node-ptwtr","kube-system/konnectivity-agent-4spvw","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478","openshift-cluster-node-tuning-operator/tuned-m8mjb","openshift-dns/node-resolver-qrflg","openshift-image-registry/node-ca-s2zfb","openshift-multus/multus-additional-cni-plugins-969sx","openshift-multus/multus-cqqlw","openshift-multus/network-metrics-daemon-znzwl"] Apr 16 18:02:07.322956 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.322932 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:07.323075 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.323050 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:07.324014 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.323993 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-spc8d" Apr 16 18:02:07.325131 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.325100 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.326054 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.326032 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:02:07.326170 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.326054 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:02:07.326170 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.326160 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xs8hp\"" Apr 16 18:02:07.326671 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.326647 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:02:07.327153 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.327133 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:02:07.327233 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.327159 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:02:07.327233 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.327198 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:02:07.327340 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.327244 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:02:07.327550 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.327534 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4spvw" Apr 16 18:02:07.327626 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.327612 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-cj8nh\"" Apr 16 18:02:07.327684 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.327658 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.327887 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.327864 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:02:07.328063 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.328047 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:02:07.328800 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.328785 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.329325 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.329304 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:02:07.329426 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.329340 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:02:07.329731 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.329717 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-29h8d\"" Apr 16 18:02:07.329794 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.329740 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 18:02:07.329884 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.329868 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qrflg" Apr 16 18:02:07.329937 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.329923 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 18:02:07.329990 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.329959 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-2wzvj\"" Apr 16 18:02:07.330532 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.330507 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:02:07.330614 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.330509 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:02:07.330670 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.330637 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:02:07.330754 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.330741 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kbls5\"" Apr 16 18:02:07.331030 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.331002 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-s2zfb" Apr 16 18:02:07.331970 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.331954 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:02:07.331970 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.331963 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nqx78\"" Apr 16 18:02:07.332109 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.332004 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:02:07.332357 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.332337 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.332576 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.332556 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:02:07.332642 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.332628 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gws24\"" Apr 16 18:02:07.333005 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.332984 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:02:07.333092 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.333014 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:02:07.333640 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.333620 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.334333 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.334316 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:02:07.334440 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.334425 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:02:07.334530 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.334514 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-r9z45\"" Apr 16 18:02:07.334750 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.334727 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:02:07.334828 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.334771 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:07.334883 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.334847 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:07.334990 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.334973 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:02:07.335053 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.335036 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:02:07.335200 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.335184 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:02:07.335281 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.335253 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:02:07.335335 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.335327 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-x6sn6\"" Apr 16 18:02:07.336438 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.336339 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-133.ec2.internal" Apr 16 18:02:07.339889 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.339867 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:02:07.342744 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.342722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.342845 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.342757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:07.342845 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.342783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-etc-openvswitch\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.342845 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.342804 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-run-openvswitch\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.342972 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.342860 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvwxg\" (UniqueName: 
\"kubernetes.io/projected/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-kube-api-access-qvwxg\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.342972 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.342904 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42f0f7c4-f605-4e8b-a431-64e78857571a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.342972 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.342932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7tgs\" (UniqueName: \"kubernetes.io/projected/24b24bc6-a399-4980-9de1-8258c56623b3-kube-api-access-j7tgs\") pod \"node-ca-s2zfb\" (UID: \"24b24bc6-a399-4980-9de1-8258c56623b3\") " pod="openshift-image-registry/node-ca-s2zfb" Apr 16 18:02:07.342972 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.342957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-run-netns\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.343114 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.342980 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-var-lib-cni-multus\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.343114 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343001 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-sysconfig\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.343114 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-run\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.343114 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0897086a-3f20-4bf5-8811-04e196266bdf-etc-tuned\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.343114 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0897086a-3f20-4bf5-8811-04e196266bdf-tmp\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 
18:02:07.343114 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343086 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42f0f7c4-f605-4e8b-a431-64e78857571a-os-release\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.343114 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343107 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-socket-dir\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.343438 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343124 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-sys-fs\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.343438 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343138 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-var-lib-cni-bin\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.343438 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343154 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fknt\" (UniqueName: \"kubernetes.io/projected/f009e89a-5e15-4d47-81de-24ab98cb437b-kube-api-access-4fknt\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:07.343438 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343178 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-var-lib-openvswitch\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.343438 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343215 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-ovnkube-config\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.343438 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343248 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkrxb\" (UniqueName: \"kubernetes.io/projected/0897086a-3f20-4bf5-8811-04e196266bdf-kube-api-access-pkrxb\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.343438 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343316 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42f0f7c4-f605-4e8b-a431-64e78857571a-cnibin\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.343438 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343355 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-hostroot\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.343438 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343397 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2nc7\" (UniqueName: \"kubernetes.io/projected/fe56bca4-2974-4d8d-a069-7f2e617e5495-kube-api-access-b2nc7\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.343438 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343427 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343452 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-systemd\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343477 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42f0f7c4-f605-4e8b-a431-64e78857571a-cni-binary-copy\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343507 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-registration-dir\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343530 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-device-dir\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-os-release\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe56bca4-2974-4d8d-a069-7f2e617e5495-cni-binary-copy\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-run-k8s-cni-cncf-io\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p44wv\" (UniqueName: \"kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv\") pod \"network-check-target-dfdlg\" (UID: \"eb29446e-bb65-416b-a40d-d985b58d7505\") " pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343644 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-run-ovn-kubernetes\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-node-log\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343697 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-var-lib-kubelet\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343715 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/be8a29c6-c9c8-407b-9a79-1120ab614958-hosts-file\") pod \"node-resolver-qrflg\" (UID: \"be8a29c6-c9c8-407b-9a79-1120ab614958\") " pod="openshift-dns/node-resolver-qrflg" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n98d\" (UniqueName: \"kubernetes.io/projected/0476aa99-7b98-404d-a37a-dfae7eb89922-kube-api-access-6n98d\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: 
\"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.343816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-system-cni-dir\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343820 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-multus-conf-dir\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fe56bca4-2974-4d8d-a069-7f2e617e5495-multus-daemon-config\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343936 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-run-multus-certs\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.343978 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-sysctl-conf\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344005 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-host\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344032 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-etc-selinux\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344055 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-kubelet\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344082 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-slash\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344108 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f0debde6-1ccc-484b-a994-63e26bc909b9-agent-certs\") pod \"konnectivity-agent-4spvw\" (UID: \"f0debde6-1ccc-484b-a994-63e26bc909b9\") " pod="kube-system/konnectivity-agent-4spvw" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344133 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42f0f7c4-f605-4e8b-a431-64e78857571a-system-cni-dir\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344160 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/42f0f7c4-f605-4e8b-a431-64e78857571a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344193 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d777df23-9c77-4ee3-a1ad-07ef46670681-iptables-alerter-script\") pod \"iptables-alerter-spc8d\" (UID: \"d777df23-9c77-4ee3-a1ad-07ef46670681\") " pod="openshift-network-operator/iptables-alerter-spc8d" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344216 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-run-systemd\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.344243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-ovn-node-metrics-cert\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-kubernetes\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-56rgx\" (UniqueName: \"kubernetes.io/projected/42f0f7c4-f605-4e8b-a431-64e78857571a-kube-api-access-56rgx\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344311 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f4s4\" (UniqueName: \"kubernetes.io/projected/d777df23-9c77-4ee3-a1ad-07ef46670681-kube-api-access-2f4s4\") pod \"iptables-alerter-spc8d\" (UID: \"d777df23-9c77-4ee3-a1ad-07ef46670681\") " pod="openshift-network-operator/iptables-alerter-spc8d" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-systemd-units\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344354 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-run-netns\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-cni-bin\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344419 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-sysctl-d\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-sys\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344445 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-log-socket\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24b24bc6-a399-4980-9de1-8258c56623b3-host\") pod \"node-ca-s2zfb\" (UID: 
\"24b24bc6-a399-4980-9de1-8258c56623b3\") " pod="openshift-image-registry/node-ca-s2zfb" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344501 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-run-ovn\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-lib-modules\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/42f0f7c4-f605-4e8b-a431-64e78857571a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344580 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/24b24bc6-a399-4980-9de1-8258c56623b3-serviceca\") pod \"node-ca-s2zfb\" (UID: \"24b24bc6-a399-4980-9de1-8258c56623b3\") " pod="openshift-image-registry/node-ca-s2zfb" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d777df23-9c77-4ee3-a1ad-07ef46670681-host-slash\") pod \"iptables-alerter-spc8d\" (UID: \"d777df23-9c77-4ee3-a1ad-07ef46670681\") " pod="openshift-network-operator/iptables-alerter-spc8d" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344629 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-cnibin\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.344703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344645 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-etc-kubernetes\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.345211 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344660 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-cni-netd\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.345211 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344674 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f0debde6-1ccc-484b-a994-63e26bc909b9-konnectivity-ca\") pod \"konnectivity-agent-4spvw\" (UID: \"f0debde6-1ccc-484b-a994-63e26bc909b9\") " pod="kube-system/konnectivity-agent-4spvw" Apr 16 18:02:07.345211 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be8a29c6-c9c8-407b-9a79-1120ab614958-tmp-dir\") pod \"node-resolver-qrflg\" (UID: \"be8a29c6-c9c8-407b-9a79-1120ab614958\") " pod="openshift-dns/node-resolver-qrflg" Apr 16 18:02:07.345211 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344734 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz6d8\" (UniqueName: \"kubernetes.io/projected/be8a29c6-c9c8-407b-9a79-1120ab614958-kube-api-access-tz6d8\") pod \"node-resolver-qrflg\" (UID: \"be8a29c6-c9c8-407b-9a79-1120ab614958\") " pod="openshift-dns/node-resolver-qrflg" Apr 16 18:02:07.345211 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-env-overrides\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.345211 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344761 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-modprobe-d\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.345211 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344776 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-multus-cni-dir\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.345211 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-multus-socket-dir-parent\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.345211 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-var-lib-kubelet\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.345211 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.344831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-ovnkube-script-lib\") pod \"ovnkube-node-ptwtr\" (UID: 
\"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.350406 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.350383 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-133.ec2.internal"] Apr 16 18:02:07.351023 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.351005 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:02:07.351083 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.351073 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" Apr 16 18:02:07.352683 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.352662 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:02:07.364433 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.364409 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 18:02:07.364830 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.364811 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal"] Apr 16 18:02:07.370270 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.370248 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fzws2" Apr 16 18:02:07.379171 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.379142 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fzws2" Apr 16 18:02:07.411253 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.411217 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:57:06 +0000 UTC" deadline="2027-11-14 14:41:12.21416704 +0000 UTC" Apr 16 18:02:07.411253 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.411248 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13844h39m4.802921414s" Apr 16 18:02:07.412054 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:07.412024 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c882a91b17682408e1f8cbddf91ea4a.slice/crio-0a33b39b708f82a2c72ebacc9b556640aa5ebe50ecb504a38f3191c7f1171221 WatchSource:0}: Error finding container 0a33b39b708f82a2c72ebacc9b556640aa5ebe50ecb504a38f3191c7f1171221: Status 404 returned error can't find the container with id 0a33b39b708f82a2c72ebacc9b556640aa5ebe50ecb504a38f3191c7f1171221 Apr 16 18:02:07.412460 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:07.412429 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod550cd3d62a888e29b98d67015acd7a34.slice/crio-39846ffe9c8089bad80647f51297c446094ffffe887286561d5b549589d09528 WatchSource:0}: Error finding container 39846ffe9c8089bad80647f51297c446094ffffe887286561d5b549589d09528: Status 404 returned error can't find the container with id 
39846ffe9c8089bad80647f51297c446094ffffe887286561d5b549589d09528 Apr 16 18:02:07.417204 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.417189 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:02:07.445166 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.445129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d777df23-9c77-4ee3-a1ad-07ef46670681-host-slash\") pod \"iptables-alerter-spc8d\" (UID: \"d777df23-9c77-4ee3-a1ad-07ef46670681\") " pod="openshift-network-operator/iptables-alerter-spc8d" Apr 16 18:02:07.445166 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.445159 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-cnibin\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.445402 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.445181 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-etc-kubernetes\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.445402 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.445203 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-cni-netd\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.445402 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.445221 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d777df23-9c77-4ee3-a1ad-07ef46670681-host-slash\") pod \"iptables-alerter-spc8d\" (UID: \"d777df23-9c77-4ee3-a1ad-07ef46670681\") " pod="openshift-network-operator/iptables-alerter-spc8d" Apr 16 18:02:07.445797 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.445226 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f0debde6-1ccc-484b-a994-63e26bc909b9-konnectivity-ca\") pod \"konnectivity-agent-4spvw\" (UID: \"f0debde6-1ccc-484b-a994-63e26bc909b9\") " pod="kube-system/konnectivity-agent-4spvw" Apr 16 18:02:07.445962 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.445942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be8a29c6-c9c8-407b-9a79-1120ab614958-tmp-dir\") pod \"node-resolver-qrflg\" (UID: \"be8a29c6-c9c8-407b-9a79-1120ab614958\") " pod="openshift-dns/node-resolver-qrflg" Apr 16 18:02:07.446088 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.445256 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-etc-kubernetes\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.446088 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.445990 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tz6d8\" 
(UniqueName: \"kubernetes.io/projected/be8a29c6-c9c8-407b-9a79-1120ab614958-kube-api-access-tz6d8\") pod \"node-resolver-qrflg\" (UID: \"be8a29c6-c9c8-407b-9a79-1120ab614958\") " pod="openshift-dns/node-resolver-qrflg" Apr 16 18:02:07.446088 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446025 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-env-overrides\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.446088 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-modprobe-d\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.446258 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-cnibin\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.446331 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446311 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-multus-cni-dir\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.446398 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-multus-socket-dir-parent\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.446485 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-var-lib-kubelet\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.446535 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-modprobe-d\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.446535 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-multus-cni-dir\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.446625 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446602 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-multus-socket-dir-parent\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.446673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-env-overrides\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.446745 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446725 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-cni-netd\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.446745 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-var-lib-kubelet\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.446849 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446779 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-ovnkube-script-lib\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.446849 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.446849 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446832 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:07.446982 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446856 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-etc-openvswitch\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.446982 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446878 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-run-openvswitch\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.446982 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446901 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qvwxg\" (UniqueName: \"kubernetes.io/projected/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-kube-api-access-qvwxg\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.446982 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446921 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42f0f7c4-f605-4e8b-a431-64e78857571a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.446982 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7tgs\" (UniqueName: \"kubernetes.io/projected/24b24bc6-a399-4980-9de1-8258c56623b3-kube-api-access-j7tgs\") pod \"node-ca-s2zfb\" (UID: \"24b24bc6-a399-4980-9de1-8258c56623b3\") " pod="openshift-image-registry/node-ca-s2zfb" Apr 16 18:02:07.446982 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446962 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-run-netns\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.447251 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be8a29c6-c9c8-407b-9a79-1120ab614958-tmp-dir\") pod \"node-resolver-qrflg\" (UID: \"be8a29c6-c9c8-407b-9a79-1120ab614958\") " pod="openshift-dns/node-resolver-qrflg" Apr 16 18:02:07.447251 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.446997 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-var-lib-cni-multus\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.447251 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-sysconfig\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.447251 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-run\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.447251 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-sysconfig\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.447251 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447242 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0897086a-3f20-4bf5-8811-04e196266bdf-etc-tuned\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.447251 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447247 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-var-lib-cni-multus\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.447553 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447249 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f0debde6-1ccc-484b-a994-63e26bc909b9-konnectivity-ca\") pod \"konnectivity-agent-4spvw\" (UID: \"f0debde6-1ccc-484b-a994-63e26bc909b9\") " pod="kube-system/konnectivity-agent-4spvw" Apr 16 18:02:07.447553 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447303 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-run\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.447553 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.447419 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:07.447553 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-run-netns\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.447553 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.447510 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs podName:f009e89a-5e15-4d47-81de-24ab98cb437b nodeName:}" failed. No retries permitted until 2026-04-16 18:02:07.947473147 +0000 UTC m=+2.060280600 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs") pod "network-metrics-daemon-znzwl" (UID: "f009e89a-5e15-4d47-81de-24ab98cb437b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:07.447811 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447791 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0897086a-3f20-4bf5-8811-04e196266bdf-tmp\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.447858 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447814 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.447858 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42f0f7c4-f605-4e8b-a431-64e78857571a-os-release\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.447941 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-socket-dir\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.447941 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447889 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-sys-fs\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.447941 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-var-lib-cni-bin\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.447941 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fknt\" (UniqueName: \"kubernetes.io/projected/f009e89a-5e15-4d47-81de-24ab98cb437b-kube-api-access-4fknt\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:07.448107 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-var-lib-openvswitch\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.448107 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-ovnkube-config\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.448107 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447990 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkrxb\" (UniqueName: \"kubernetes.io/projected/0897086a-3f20-4bf5-8811-04e196266bdf-kube-api-access-pkrxb\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.448107 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.447994 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-ovnkube-script-lib\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.448107 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42f0f7c4-f605-4e8b-a431-64e78857571a-cnibin\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.448107 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/42f0f7c4-f605-4e8b-a431-64e78857571a-cnibin\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.448107 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-hostroot\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.448107 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448072 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-run-openvswitch\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.448107 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2nc7\" (UniqueName: \"kubernetes.io/projected/fe56bca4-2974-4d8d-a069-7f2e617e5495-kube-api-access-b2nc7\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.448502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448124 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.448502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448155 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-systemd\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.448502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42f0f7c4-f605-4e8b-a431-64e78857571a-cni-binary-copy\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.448502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-registration-dir\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.448502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448243 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-device-dir\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.448502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-os-release\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.448502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe56bca4-2974-4d8d-a069-7f2e617e5495-cni-binary-copy\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.448502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448321 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:02:07.448502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448345 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/42f0f7c4-f605-4e8b-a431-64e78857571a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.448502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-run-k8s-cni-cncf-io\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.448502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448434 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-var-lib-cni-bin\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.448502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448502 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/42f0f7c4-f605-4e8b-a431-64e78857571a-os-release\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.449004 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-socket-dir\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.449004 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448678 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-sys-fs\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.449004 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-etc-openvswitch\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.449004 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448740 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-var-lib-openvswitch\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.449200 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.449176 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-systemd\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.449245 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.449206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-ovnkube-config\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.449294 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.449243 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-hostroot\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.449335 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.449311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-device-dir\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.449525 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.449502 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.449594 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.449578 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-os-release\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.449678 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.449657 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-registration-dir\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.449914 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.449892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/42f0f7c4-f605-4e8b-a431-64e78857571a-cni-binary-copy\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.450296 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.450275 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fe56bca4-2974-4d8d-a069-7f2e617e5495-cni-binary-copy\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.450344 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.448330 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-run-k8s-cni-cncf-io\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.450780 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.450763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p44wv\" (UniqueName: \"kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv\") pod \"network-check-target-dfdlg\" (UID: \"eb29446e-bb65-416b-a40d-d985b58d7505\") " pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:07.450824 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.450791 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-run-ovn-kubernetes\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.450824 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.450811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-node-log\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.450897 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.450857 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-node-log\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.450897 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.450871 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-run-ovn-kubernetes\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.450989 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.450925 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-var-lib-kubelet\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.450989 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.450970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/be8a29c6-c9c8-407b-9a79-1120ab614958-hosts-file\") pod \"node-resolver-qrflg\" (UID: \"be8a29c6-c9c8-407b-9a79-1120ab614958\") " pod="openshift-dns/node-resolver-qrflg" Apr 16 18:02:07.450989 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.450981 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-var-lib-kubelet\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.451132 
ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6n98d\" (UniqueName: \"kubernetes.io/projected/0476aa99-7b98-404d-a37a-dfae7eb89922-kube-api-access-6n98d\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.451132 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/be8a29c6-c9c8-407b-9a79-1120ab614958-hosts-file\") pod \"node-resolver-qrflg\" (UID: \"be8a29c6-c9c8-407b-9a79-1120ab614958\") " pod="openshift-dns/node-resolver-qrflg" Apr 16 18:02:07.451132 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-system-cni-dir\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.451132 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-multus-conf-dir\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.451132 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fe56bca4-2974-4d8d-a069-7f2e617e5495-multus-daemon-config\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.451132 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451109 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-system-cni-dir\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.451132 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451115 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-run-multus-certs\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.451132 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451123 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-multus-conf-dir\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.451489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451150 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-sysctl-conf\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.451489 ip-10-0-134-133 
kubenswrapper[2577]: I0416 18:02:07.451152 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fe56bca4-2974-4d8d-a069-7f2e617e5495-host-run-multus-certs\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.451489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-sysctl-conf\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.451489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-host\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.451489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-etc-selinux\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.451489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451315 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-kubelet\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.451489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-slash\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.451489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451347 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-host\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.451489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451401 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-slash\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.451489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451413 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0476aa99-7b98-404d-a37a-dfae7eb89922-etc-selinux\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.451489 
ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f0debde6-1ccc-484b-a994-63e26bc909b9-agent-certs\") pod \"konnectivity-agent-4spvw\" (UID: \"f0debde6-1ccc-484b-a994-63e26bc909b9\") " pod="kube-system/konnectivity-agent-4spvw" Apr 16 18:02:07.451489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451426 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-kubelet\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451557 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42f0f7c4-f605-4e8b-a431-64e78857571a-system-cni-dir\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451591 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/42f0f7c4-f605-4e8b-a431-64e78857571a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d777df23-9c77-4ee3-a1ad-07ef46670681-iptables-alerter-script\") pod \"iptables-alerter-spc8d\" (UID: \"d777df23-9c77-4ee3-a1ad-07ef46670681\") " pod="openshift-network-operator/iptables-alerter-spc8d" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451624 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fe56bca4-2974-4d8d-a069-7f2e617e5495-multus-daemon-config\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/42f0f7c4-f605-4e8b-a431-64e78857571a-system-cni-dir\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451732 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-run-systemd\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-ovn-node-metrics-cert\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451789 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-kubernetes\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56rgx\" (UniqueName: \"kubernetes.io/projected/42f0f7c4-f605-4e8b-a431-64e78857571a-kube-api-access-56rgx\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451859 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2f4s4\" (UniqueName: \"kubernetes.io/projected/d777df23-9c77-4ee3-a1ad-07ef46670681-kube-api-access-2f4s4\") pod \"iptables-alerter-spc8d\" (UID: \"d777df23-9c77-4ee3-a1ad-07ef46670681\") " pod="openshift-network-operator/iptables-alerter-spc8d" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451867 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0897086a-3f20-4bf5-8811-04e196266bdf-etc-tuned\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0897086a-3f20-4bf5-8811-04e196266bdf-tmp\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451885 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-systemd-units\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-run-netns\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-run-systemd\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451974 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-systemd-units\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.451980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451977 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-cni-bin\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.451937 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-cni-bin\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452019 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-kubernetes\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-sysctl-d\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-sys\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-log-socket\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-etc-sysctl-d\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24b24bc6-a399-4980-9de1-8258c56623b3-host\") pod \"node-ca-s2zfb\" (UID: \"24b24bc6-a399-4980-9de1-8258c56623b3\") " pod="openshift-image-registry/node-ca-s2zfb" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/24b24bc6-a399-4980-9de1-8258c56623b3-host\") pod \"node-ca-s2zfb\" (UID: \"24b24bc6-a399-4980-9de1-8258c56623b3\") " pod="openshift-image-registry/node-ca-s2zfb" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452142 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d777df23-9c77-4ee3-a1ad-07ef46670681-iptables-alerter-script\") pod \"iptables-alerter-spc8d\" (UID: \"d777df23-9c77-4ee3-a1ad-07ef46670681\") " pod="openshift-network-operator/iptables-alerter-spc8d" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452150 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/42f0f7c4-f605-4e8b-a431-64e78857571a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-run-ovn\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-host-run-netns\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452198 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-log-socket\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-lib-modules\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452215 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-sys\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452166 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-run-ovn\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452237 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/42f0f7c4-f605-4e8b-a431-64e78857571a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.452621 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/24b24bc6-a399-4980-9de1-8258c56623b3-serviceca\") pod \"node-ca-s2zfb\" (UID: \"24b24bc6-a399-4980-9de1-8258c56623b3\") " pod="openshift-image-registry/node-ca-s2zfb" Apr 16 18:02:07.453105 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452308 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0897086a-3f20-4bf5-8811-04e196266bdf-lib-modules\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.453105 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452699 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/24b24bc6-a399-4980-9de1-8258c56623b3-serviceca\") pod \"node-ca-s2zfb\" (UID: \"24b24bc6-a399-4980-9de1-8258c56623b3\") " pod="openshift-image-registry/node-ca-s2zfb" Apr 16 18:02:07.453105 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.452720 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/42f0f7c4-f605-4e8b-a431-64e78857571a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.453931 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.453911 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-ovn-node-metrics-cert\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.454536 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.454516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f0debde6-1ccc-484b-a994-63e26bc909b9-agent-certs\") pod \"konnectivity-agent-4spvw\" (UID: \"f0debde6-1ccc-484b-a994-63e26bc909b9\") " pod="kube-system/konnectivity-agent-4spvw" Apr 16 18:02:07.463586 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.463555 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkrxb\" (UniqueName: \"kubernetes.io/projected/0897086a-3f20-4bf5-8811-04e196266bdf-kube-api-access-pkrxb\") pod \"tuned-m8mjb\" (UID: \"0897086a-3f20-4bf5-8811-04e196266bdf\") " pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.466181 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.466160 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2nc7\" (UniqueName: \"kubernetes.io/projected/fe56bca4-2974-4d8d-a069-7f2e617e5495-kube-api-access-b2nc7\") pod \"multus-cqqlw\" (UID: \"fe56bca4-2974-4d8d-a069-7f2e617e5495\") " pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.467908 ip-10-0-134-133 
kubenswrapper[2577]: I0416 18:02:07.467889 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7tgs\" (UniqueName: \"kubernetes.io/projected/24b24bc6-a399-4980-9de1-8258c56623b3-kube-api-access-j7tgs\") pod \"node-ca-s2zfb\" (UID: \"24b24bc6-a399-4980-9de1-8258c56623b3\") " pod="openshift-image-registry/node-ca-s2zfb" Apr 16 18:02:07.468596 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.468574 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fknt\" (UniqueName: \"kubernetes.io/projected/f009e89a-5e15-4d47-81de-24ab98cb437b-kube-api-access-4fknt\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:07.475142 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.475098 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" event={"ID":"8c882a91b17682408e1f8cbddf91ea4a","Type":"ContainerStarted","Data":"0a33b39b708f82a2c72ebacc9b556640aa5ebe50ecb504a38f3191c7f1171221"} Apr 16 18:02:07.476015 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.475994 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-133.ec2.internal" event={"ID":"550cd3d62a888e29b98d67015acd7a34","Type":"ContainerStarted","Data":"39846ffe9c8089bad80647f51297c446094ffffe887286561d5b549589d09528"} Apr 16 18:02:07.477175 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.477160 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:07.477223 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.477178 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:07.477223 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.477189 2577 projected.go:194] Error preparing data for projected volume kube-api-access-p44wv for pod openshift-network-diagnostics/network-check-target-dfdlg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:07.477294 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.477252 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv podName:eb29446e-bb65-416b-a40d-d985b58d7505 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:07.977236342 +0000 UTC m=+2.090043789 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p44wv" (UniqueName: "kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv") pod "network-check-target-dfdlg" (UID: "eb29446e-bb65-416b-a40d-d985b58d7505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:07.481511 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.481491 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f4s4\" (UniqueName: \"kubernetes.io/projected/d777df23-9c77-4ee3-a1ad-07ef46670681-kube-api-access-2f4s4\") pod \"iptables-alerter-spc8d\" (UID: \"d777df23-9c77-4ee3-a1ad-07ef46670681\") " pod="openshift-network-operator/iptables-alerter-spc8d" Apr 16 18:02:07.481663 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.481647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz6d8\" (UniqueName: \"kubernetes.io/projected/be8a29c6-c9c8-407b-9a79-1120ab614958-kube-api-access-tz6d8\") pod \"node-resolver-qrflg\" (UID: \"be8a29c6-c9c8-407b-9a79-1120ab614958\") " pod="openshift-dns/node-resolver-qrflg" Apr 16 18:02:07.485059 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.485040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvwxg\" (UniqueName: \"kubernetes.io/projected/0041c0cb-37ba-4e2d-8ab3-73fe90eb40df-kube-api-access-qvwxg\") pod \"ovnkube-node-ptwtr\" (UID: \"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df\") " pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.486976 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.486960 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n98d\" (UniqueName: \"kubernetes.io/projected/0476aa99-7b98-404d-a37a-dfae7eb89922-kube-api-access-6n98d\") pod \"aws-ebs-csi-driver-node-mt478\" (UID: \"0476aa99-7b98-404d-a37a-dfae7eb89922\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.488131 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.488115 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56rgx\" (UniqueName: \"kubernetes.io/projected/42f0f7c4-f605-4e8b-a431-64e78857571a-kube-api-access-56rgx\") pod \"multus-additional-cni-plugins-969sx\" (UID: \"42f0f7c4-f605-4e8b-a431-64e78857571a\") " pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.523592 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.523565 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:07.654821 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.654727 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-spc8d" Apr 16 18:02:07.661263 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:07.661238 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd777df23_9c77_4ee3_a1ad_07ef46670681.slice/crio-de83fdecfc13bd458b2d1c11b36c264b9d235577faccd6ae6ae0f29806653f35 WatchSource:0}: Error finding container de83fdecfc13bd458b2d1c11b36c264b9d235577faccd6ae6ae0f29806653f35: Status 404 returned error can't find the container with id de83fdecfc13bd458b2d1c11b36c264b9d235577faccd6ae6ae0f29806653f35 Apr 16 18:02:07.667929 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.667908 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:07.674831 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:07.674802 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0041c0cb_37ba_4e2d_8ab3_73fe90eb40df.slice/crio-5564949f6715da66f9b60295bab82142997c63d346cdf059e30d40d35adc588a WatchSource:0}: Error finding container 5564949f6715da66f9b60295bab82142997c63d346cdf059e30d40d35adc588a: Status 404 returned error can't find the container with id 5564949f6715da66f9b60295bab82142997c63d346cdf059e30d40d35adc588a Apr 16 18:02:07.685407 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.685361 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4spvw" Apr 16 18:02:07.690179 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.690148 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" Apr 16 18:02:07.694186 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:07.694150 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0debde6_1ccc_484b_a994_63e26bc909b9.slice/crio-7fdce9215b30c7aca21b20392c4c6d20388caef6c4a9475ecf0e58d13311c036 WatchSource:0}: Error finding container 7fdce9215b30c7aca21b20392c4c6d20388caef6c4a9475ecf0e58d13311c036: Status 404 returned error can't find the container with id 7fdce9215b30c7aca21b20392c4c6d20388caef6c4a9475ecf0e58d13311c036 Apr 16 18:02:07.697999 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:07.697969 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0476aa99_7b98_404d_a37a_dfae7eb89922.slice/crio-1e428aebd6fdd9db88d0bcf62a882eaebdc90ef5639376e1ec440e92ef27e76c WatchSource:0}: Error finding container 1e428aebd6fdd9db88d0bcf62a882eaebdc90ef5639376e1ec440e92ef27e76c: Status 404 returned error can't find the container with id 1e428aebd6fdd9db88d0bcf62a882eaebdc90ef5639376e1ec440e92ef27e76c Apr 16 18:02:07.707298 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.707273 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" Apr 16 18:02:07.713767 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:07.713735 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0897086a_3f20_4bf5_8811_04e196266bdf.slice/crio-55679230e1b6b0bb48c1eae7166a9c16f2d435731f0fc52543aa3f39ad591919 WatchSource:0}: Error finding container 55679230e1b6b0bb48c1eae7166a9c16f2d435731f0fc52543aa3f39ad591919: Status 404 returned error can't find the container with id 55679230e1b6b0bb48c1eae7166a9c16f2d435731f0fc52543aa3f39ad591919 Apr 16 18:02:07.718345 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.718321 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qrflg" Apr 16 18:02:07.724346 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.724321 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-s2zfb" Apr 16 18:02:07.724531 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:07.724506 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe8a29c6_c9c8_407b_9a79_1120ab614958.slice/crio-a6c97667b8e5a5a94826c1c8201bed41942fc7f58b07d70bdc296221d101674e WatchSource:0}: Error finding container a6c97667b8e5a5a94826c1c8201bed41942fc7f58b07d70bdc296221d101674e: Status 404 returned error can't find the container with id a6c97667b8e5a5a94826c1c8201bed41942fc7f58b07d70bdc296221d101674e Apr 16 18:02:07.730073 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.730052 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-969sx" Apr 16 18:02:07.730239 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:07.730216 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b24bc6_a399_4980_9de1_8258c56623b3.slice/crio-f7b15bb1caedd9e28b3f76b44ae0180d854ca77d681f9ea4bc02ebefd50e2a59 WatchSource:0}: Error finding container f7b15bb1caedd9e28b3f76b44ae0180d854ca77d681f9ea4bc02ebefd50e2a59: Status 404 returned error can't find the container with id f7b15bb1caedd9e28b3f76b44ae0180d854ca77d681f9ea4bc02ebefd50e2a59 Apr 16 18:02:07.735655 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.735635 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cqqlw" Apr 16 18:02:07.735915 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:07.735896 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42f0f7c4_f605_4e8b_a431_64e78857571a.slice/crio-d09205c47a6918ec18cd8fe72c68e4009c9f9176a40d5ce99390118e943eac4e WatchSource:0}: Error finding container d09205c47a6918ec18cd8fe72c68e4009c9f9176a40d5ce99390118e943eac4e: Status 404 returned error can't find the container with id d09205c47a6918ec18cd8fe72c68e4009c9f9176a40d5ce99390118e943eac4e Apr 16 18:02:07.742986 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:07.742961 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe56bca4_2974_4d8d_a069_7f2e617e5495.slice/crio-f5ba78e59cff8ea9bcee1117bac51cd6233aa435009e49061202863a028aec30 WatchSource:0}: Error finding container f5ba78e59cff8ea9bcee1117bac51cd6233aa435009e49061202863a028aec30: Status 404 returned error can't find the container with id f5ba78e59cff8ea9bcee1117bac51cd6233aa435009e49061202863a028aec30 Apr 16 18:02:07.745682 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.745662 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:07.956734 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:07.956641 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:07.956895 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.956810 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:07.956895 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:07.956888 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs podName:f009e89a-5e15-4d47-81de-24ab98cb437b nodeName:}" failed. No retries permitted until 2026-04-16 18:02:08.956868447 +0000 UTC m=+3.069675879 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs") pod "network-metrics-daemon-znzwl" (UID: "f009e89a-5e15-4d47-81de-24ab98cb437b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:08.057543 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.057505 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p44wv\" (UniqueName: \"kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv\") pod \"network-check-target-dfdlg\" (UID: \"eb29446e-bb65-416b-a40d-d985b58d7505\") " pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:08.057704 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:08.057661 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:08.057704 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:08.057679 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:08.057704 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:08.057692 2577 projected.go:194] Error preparing data for projected volume kube-api-access-p44wv for pod openshift-network-diagnostics/network-check-target-dfdlg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:08.057882 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:08.057783 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv podName:eb29446e-bb65-416b-a40d-d985b58d7505 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:09.057763449 +0000 UTC m=+3.170570884 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p44wv" (UniqueName: "kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv") pod "network-check-target-dfdlg" (UID: "eb29446e-bb65-416b-a40d-d985b58d7505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:08.380082 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.379991 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:07 +0000 UTC" deadline="2027-09-25 00:04:30.823495549 +0000 UTC" Apr 16 18:02:08.380082 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.380026 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12630h2m22.443473722s" Apr 16 18:02:08.503083 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.503038 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" event={"ID":"0897086a-3f20-4bf5-8811-04e196266bdf","Type":"ContainerStarted","Data":"55679230e1b6b0bb48c1eae7166a9c16f2d435731f0fc52543aa3f39ad591919"} Apr 16 18:02:08.516131 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.516088 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" event={"ID":"0476aa99-7b98-404d-a37a-dfae7eb89922","Type":"ContainerStarted","Data":"1e428aebd6fdd9db88d0bcf62a882eaebdc90ef5639376e1ec440e92ef27e76c"} Apr 16 18:02:08.517554 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.517521 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4spvw" event={"ID":"f0debde6-1ccc-484b-a994-63e26bc909b9","Type":"ContainerStarted","Data":"7fdce9215b30c7aca21b20392c4c6d20388caef6c4a9475ecf0e58d13311c036"} Apr 16 18:02:08.519026 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.518994 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qrflg" event={"ID":"be8a29c6-c9c8-407b-9a79-1120ab614958","Type":"ContainerStarted","Data":"a6c97667b8e5a5a94826c1c8201bed41942fc7f58b07d70bdc296221d101674e"} Apr 16 18:02:08.528287 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.528214 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" event={"ID":"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df","Type":"ContainerStarted","Data":"5564949f6715da66f9b60295bab82142997c63d346cdf059e30d40d35adc588a"} Apr 16 18:02:08.535019 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.534954 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-spc8d" event={"ID":"d777df23-9c77-4ee3-a1ad-07ef46670681","Type":"ContainerStarted","Data":"de83fdecfc13bd458b2d1c11b36c264b9d235577faccd6ae6ae0f29806653f35"} Apr 16 18:02:08.552883 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.552814 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cqqlw" event={"ID":"fe56bca4-2974-4d8d-a069-7f2e617e5495","Type":"ContainerStarted","Data":"f5ba78e59cff8ea9bcee1117bac51cd6233aa435009e49061202863a028aec30"} Apr 16 18:02:08.558774 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.558701 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-969sx" 
event={"ID":"42f0f7c4-f605-4e8b-a431-64e78857571a","Type":"ContainerStarted","Data":"d09205c47a6918ec18cd8fe72c68e4009c9f9176a40d5ce99390118e943eac4e"} Apr 16 18:02:08.562117 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.562053 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-s2zfb" event={"ID":"24b24bc6-a399-4980-9de1-8258c56623b3","Type":"ContainerStarted","Data":"f7b15bb1caedd9e28b3f76b44ae0180d854ca77d681f9ea4bc02ebefd50e2a59"} Apr 16 18:02:08.653413 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.653320 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:08.964448 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:08.964346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:08.964631 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:08.964584 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:08.964692 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:08.964650 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs podName:f009e89a-5e15-4d47-81de-24ab98cb437b nodeName:}" failed. No retries permitted until 2026-04-16 18:02:10.964631987 +0000 UTC m=+5.077439436 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs") pod "network-metrics-daemon-znzwl" (UID: "f009e89a-5e15-4d47-81de-24ab98cb437b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:09.065831 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:09.065793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p44wv\" (UniqueName: \"kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv\") pod \"network-check-target-dfdlg\" (UID: \"eb29446e-bb65-416b-a40d-d985b58d7505\") " pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:09.066575 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:09.066023 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:09.066575 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:09.066048 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:09.066575 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:09.066062 2577 projected.go:194] Error preparing data for projected volume kube-api-access-p44wv for pod openshift-network-diagnostics/network-check-target-dfdlg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:09.066575 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:09.066121 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv podName:eb29446e-bb65-416b-a40d-d985b58d7505 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:11.066102949 +0000 UTC m=+5.178910421 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-p44wv" (UniqueName: "kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv") pod "network-check-target-dfdlg" (UID: "eb29446e-bb65-416b-a40d-d985b58d7505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:09.381284 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:09.381187 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:07 +0000 UTC" deadline="2027-11-30 16:32:49.988441363 +0000 UTC" Apr 16 18:02:09.381284 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:09.381229 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14230h30m40.607216367s" Apr 16 18:02:09.474250 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:09.473522 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:09.474250 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:09.473664 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:09.474250 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:09.474096 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:09.474250 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:09.474183 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:10.456475 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:10.456445 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bsftx"] Apr 16 18:02:10.459355 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:10.459330 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:10.459503 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:10.459423 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:10.481144 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:10.480870 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/842ea51c-5928-423a-9820-b4041ccdbe7b-kubelet-config\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:10.481144 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:10.480920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/842ea51c-5928-423a-9820-b4041ccdbe7b-dbus\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:10.481144 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:10.481003 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:10.581883 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:10.581848 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:10.582046 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:10.581914 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/842ea51c-5928-423a-9820-b4041ccdbe7b-kubelet-config\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:10.582046 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:10.581943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/842ea51c-5928-423a-9820-b4041ccdbe7b-dbus\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:10.582119 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:10.582108 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/842ea51c-5928-423a-9820-b4041ccdbe7b-dbus\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:10.582231 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:10.582214 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:10.582291 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:10.582279 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret podName:842ea51c-5928-423a-9820-b4041ccdbe7b nodeName:}" failed. 
No retries permitted until 2026-04-16 18:02:11.082260865 +0000 UTC m=+5.195068298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret") pod "global-pull-secret-syncer-bsftx" (UID: "842ea51c-5928-423a-9820-b4041ccdbe7b") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:10.582557 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:10.582535 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/842ea51c-5928-423a-9820-b4041ccdbe7b-kubelet-config\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:10.985211 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:10.985173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:10.985410 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:10.985337 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:10.985471 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:10.985421 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs podName:f009e89a-5e15-4d47-81de-24ab98cb437b nodeName:}" failed. No retries permitted until 2026-04-16 18:02:14.985400465 +0000 UTC m=+9.098207905 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs") pod "network-metrics-daemon-znzwl" (UID: "f009e89a-5e15-4d47-81de-24ab98cb437b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:11.085754 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:11.085702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:11.085754 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:11.085751 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p44wv\" (UniqueName: \"kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv\") pod \"network-check-target-dfdlg\" (UID: \"eb29446e-bb65-416b-a40d-d985b58d7505\") " pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:11.085993 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:11.085938 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:11.085993 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:11.085960 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:11.085993 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:11.085968 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:11.086141 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:11.086046 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret podName:842ea51c-5928-423a-9820-b4041ccdbe7b nodeName:}" failed. No retries permitted until 2026-04-16 18:02:12.086025775 +0000 UTC m=+6.198833210 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret") pod "global-pull-secret-syncer-bsftx" (UID: "842ea51c-5928-423a-9820-b4041ccdbe7b") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:11.086141 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:11.085974 2577 projected.go:194] Error preparing data for projected volume kube-api-access-p44wv for pod openshift-network-diagnostics/network-check-target-dfdlg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:11.086141 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:11.086111 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv podName:eb29446e-bb65-416b-a40d-d985b58d7505 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:15.086093351 +0000 UTC m=+9.198900787 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p44wv" (UniqueName: "kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv") pod "network-check-target-dfdlg" (UID: "eb29446e-bb65-416b-a40d-d985b58d7505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:11.472637 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:11.472604 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:11.473075 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:11.472611 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:11.473075 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:11.472746 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:11.473075 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:11.472833 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:12.094803 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:12.094763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:12.095002 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:12.094936 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:12.095071 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:12.095004 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret podName:842ea51c-5928-423a-9820-b4041ccdbe7b nodeName:}" failed. No retries permitted until 2026-04-16 18:02:14.094985154 +0000 UTC m=+8.207792590 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret") pod "global-pull-secret-syncer-bsftx" (UID: "842ea51c-5928-423a-9820-b4041ccdbe7b") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:12.473645 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:12.473430 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:12.473645 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:12.473574 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:13.473218 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:13.473186 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:13.473408 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:13.473186 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:13.473408 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:13.473339 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:13.473538 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:13.473435 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:14.114753 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:14.114704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:14.115223 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:14.114870 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:14.115223 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:14.114936 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret podName:842ea51c-5928-423a-9820-b4041ccdbe7b nodeName:}" failed. No retries permitted until 2026-04-16 18:02:18.114918667 +0000 UTC m=+12.227726100 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret") pod "global-pull-secret-syncer-bsftx" (UID: "842ea51c-5928-423a-9820-b4041ccdbe7b") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:14.473081 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:14.472962 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:14.473230 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:14.473100 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:15.023860 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:15.023823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:15.024054 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:15.024014 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:15.024110 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:15.024077 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs podName:f009e89a-5e15-4d47-81de-24ab98cb437b nodeName:}" failed. No retries permitted until 2026-04-16 18:02:23.02405812 +0000 UTC m=+17.136865555 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs") pod "network-metrics-daemon-znzwl" (UID: "f009e89a-5e15-4d47-81de-24ab98cb437b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:15.125360 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:15.125243 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p44wv\" (UniqueName: \"kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv\") pod \"network-check-target-dfdlg\" (UID: \"eb29446e-bb65-416b-a40d-d985b58d7505\") " pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:15.125781 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:15.125433 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:15.125781 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:15.125452 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:15.125781 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:15.125465 2577 projected.go:194] Error preparing data for projected volume kube-api-access-p44wv for pod openshift-network-diagnostics/network-check-target-dfdlg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:15.125781 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:15.125524 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv podName:eb29446e-bb65-416b-a40d-d985b58d7505 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:02:23.125506366 +0000 UTC m=+17.238313802 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-p44wv" (UniqueName: "kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv") pod "network-check-target-dfdlg" (UID: "eb29446e-bb65-416b-a40d-d985b58d7505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:15.473576 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:15.473506 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:15.473759 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:15.473642 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:15.474071 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:15.474052 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:15.474186 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:15.474166 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:16.474947 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:16.474900 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:16.475412 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:16.475061 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:17.472687 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:17.472651 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:17.472867 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:17.472651 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:17.472867 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:17.472802 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:17.472969 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:17.472886 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:18.151059 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:18.151020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:18.151477 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:18.151164 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:18.151477 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:18.151219 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret podName:842ea51c-5928-423a-9820-b4041ccdbe7b nodeName:}" failed. No retries permitted until 2026-04-16 18:02:26.151204894 +0000 UTC m=+20.264012332 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret") pod "global-pull-secret-syncer-bsftx" (UID: "842ea51c-5928-423a-9820-b4041ccdbe7b") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:18.473428 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:18.473322 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:18.473566 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:18.473469 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:19.473075 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:19.473039 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:19.473556 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:19.473041 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:19.473556 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:19.473169 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:19.473556 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:19.473282 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:20.473337 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:20.473301 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:20.473796 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:20.473458 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:21.472878 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:21.472845 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:21.473132 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:21.472911 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:21.473132 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:21.473010 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:21.473216 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:21.473150 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:22.472728 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:22.472697 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:22.473118 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:22.472824 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:23.090893 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:23.090854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:23.091127 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:23.090988 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:23.091127 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:23.091052 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs podName:f009e89a-5e15-4d47-81de-24ab98cb437b nodeName:}" failed. No retries permitted until 2026-04-16 18:02:39.091036744 +0000 UTC m=+33.203844191 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs") pod "network-metrics-daemon-znzwl" (UID: "f009e89a-5e15-4d47-81de-24ab98cb437b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:23.191908 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:23.191871 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p44wv\" (UniqueName: \"kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv\") pod \"network-check-target-dfdlg\" (UID: \"eb29446e-bb65-416b-a40d-d985b58d7505\") " pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:23.192100 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:23.192017 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:23.192100 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:23.192037 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:23.192100 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:23.192049 2577 projected.go:194] Error preparing data for projected volume kube-api-access-p44wv for pod openshift-network-diagnostics/network-check-target-dfdlg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:23.192242 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:23.192108 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv podName:eb29446e-bb65-416b-a40d-d985b58d7505 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:39.192088714 +0000 UTC m=+33.304896160 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p44wv" (UniqueName: "kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv") pod "network-check-target-dfdlg" (UID: "eb29446e-bb65-416b-a40d-d985b58d7505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:23.473260 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:23.473180 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:23.473701 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:23.473179 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:23.473701 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:23.473318 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:23.473701 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:23.473399 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:24.473140 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:24.473108 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:24.473319 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:24.473239 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:25.472658 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:25.472622 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:25.472799 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:25.472725 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:25.472799 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:25.472781 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:25.472886 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:25.472869 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:26.213746 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.213307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:26.214285 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:26.213463 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:26.214285 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:26.213870 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret podName:842ea51c-5928-423a-9820-b4041ccdbe7b nodeName:}" failed. No retries permitted until 2026-04-16 18:02:42.2138516 +0000 UTC m=+36.326659049 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret") pod "global-pull-secret-syncer-bsftx" (UID: "842ea51c-5928-423a-9820-b4041ccdbe7b") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:26.472781 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.472714 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:26.476490 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:26.473737 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:26.596176 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.595963 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-133.ec2.internal" event={"ID":"550cd3d62a888e29b98d67015acd7a34","Type":"ContainerStarted","Data":"dc32cea8f54672898cceac9b645309add2892236234d76da6625d44b62e072fc"} Apr 16 18:02:26.597301 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.597278 2577 generic.go:358] "Generic (PLEG): container finished" podID="8c882a91b17682408e1f8cbddf91ea4a" containerID="20ee22cd0b2fb7b3a874c08f7625ca0f6501e15c7bca90625b20350206033326" exitCode=0 Apr 16 18:02:26.597414 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.597348 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" event={"ID":"8c882a91b17682408e1f8cbddf91ea4a","Type":"ContainerDied","Data":"20ee22cd0b2fb7b3a874c08f7625ca0f6501e15c7bca90625b20350206033326"} Apr 16 18:02:26.598685 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.598644 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cqqlw" event={"ID":"fe56bca4-2974-4d8d-a069-7f2e617e5495","Type":"ContainerStarted","Data":"e1fb7b1c6b7467049904ce861ecc002444fa9f8ddc3a7c129522d541c8eb83db"} Apr 16 18:02:26.600031 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.600010 2577 generic.go:358] "Generic (PLEG): container finished" podID="42f0f7c4-f605-4e8b-a431-64e78857571a" containerID="9711dad2396eba9b3084474afdbf18047f342e61e0bec2bd8494690c502c0d71" exitCode=0 Apr 16 18:02:26.600112 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.600065 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-969sx" event={"ID":"42f0f7c4-f605-4e8b-a431-64e78857571a","Type":"ContainerDied","Data":"9711dad2396eba9b3084474afdbf18047f342e61e0bec2bd8494690c502c0d71"} Apr 16 18:02:26.601454 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.601342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-s2zfb" event={"ID":"24b24bc6-a399-4980-9de1-8258c56623b3","Type":"ContainerStarted","Data":"adf77a01f57adf5ed109cc1fb27eabdb82b9e2252dc60f50d9fc16402f48bea3"} Apr 16 18:02:26.602512 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.602489 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" event={"ID":"0897086a-3f20-4bf5-8811-04e196266bdf","Type":"ContainerStarted","Data":"2b8c88165c52552bd4995ca6587e810aa55ce94e9111e0d8f72c58c3d2fb2707"} Apr 16 18:02:26.603628 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.603601 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" event={"ID":"0476aa99-7b98-404d-a37a-dfae7eb89922","Type":"ContainerStarted","Data":"ac85149421bbc405cca7ee8014b6af2278803ab19669a94378dac8a4b0e990aa"} Apr 16 18:02:26.604853 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.604772 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4spvw" event={"ID":"f0debde6-1ccc-484b-a994-63e26bc909b9","Type":"ContainerStarted","Data":"03136107de67f9a206cfe32144011a9fed87a69c819901004f57168e3863374c"} Apr 16 18:02:26.606062 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.606041 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-qrflg" event={"ID":"be8a29c6-c9c8-407b-9a79-1120ab614958","Type":"ContainerStarted","Data":"9fce107539bc5db9fa922f995798fe9d369da6cc461c96257c477f0ab4b3d359"} Apr 16 18:02:26.608320 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.608304 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:02:26.608664 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.608644 2577 generic.go:358] "Generic (PLEG): container finished" podID="0041c0cb-37ba-4e2d-8ab3-73fe90eb40df" containerID="5773b990baf8368c28e31f67bf34e72a473e4bf8be8d2ec06f5cbff75994d969" exitCode=1 Apr 16 18:02:26.608742 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.608680 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" event={"ID":"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df","Type":"ContainerStarted","Data":"0f3708cdd4e480fbe69af727528d88aeec6e674130e484fd15415845b9ac2f9e"} Apr 16 18:02:26.608742 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.608699 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" event={"ID":"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df","Type":"ContainerStarted","Data":"40c102e200f28d1f6cc0fcbbbd264c1cd810767f85ecee918bff7cd39bf3ece4"} Apr 16 18:02:26.608742 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.608712 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" event={"ID":"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df","Type":"ContainerStarted","Data":"b7d1c2d314b423b307261e631af3fb48a5e47c6d51ca906f9c32fadf116b5565"} Apr 16 18:02:26.608742 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.608725 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" event={"ID":"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df","Type":"ContainerStarted","Data":"07086889e3a89daa35283b628a70ca5ec2fa13e2ac7b4da1e67c74a07c4a802d"} Apr 16 18:02:26.608742 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.608737 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" event={"ID":"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df","Type":"ContainerDied","Data":"5773b990baf8368c28e31f67bf34e72a473e4bf8be8d2ec06f5cbff75994d969"} Apr 16 18:02:26.608893 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.608750 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" event={"ID":"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df","Type":"ContainerStarted","Data":"eeb62f74828352875c280fdd45477b3dee419b0afe752d3284a7b4a0d2434b4f"} Apr 16 18:02:26.622264 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.622206 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cqqlw" podStartSLOduration=2.489499032 podStartE2EDuration="20.622186586s" podCreationTimestamp="2026-04-16 18:02:06 +0000 UTC" firstStartedPulling="2026-04-16 18:02:07.744521745 +0000 UTC m=+1.857329178" lastFinishedPulling="2026-04-16 18:02:25.877209299 +0000 UTC m=+19.990016732" observedRunningTime="2026-04-16 18:02:26.6220106 +0000 UTC m=+20.734818058" watchObservedRunningTime="2026-04-16 18:02:26.622186586 +0000 UTC m=+20.734994042" Apr 16 18:02:26.622707 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.622672 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-134-133.ec2.internal" podStartSLOduration=19.622659232 podStartE2EDuration="19.622659232s" podCreationTimestamp="2026-04-16 18:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:02:26.608998392 +0000 UTC m=+20.721805847" watchObservedRunningTime="2026-04-16 18:02:26.622659232 +0000 UTC m=+20.735466687" Apr 16 18:02:26.635346 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.635284 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4spvw" podStartSLOduration=3.018076006 podStartE2EDuration="20.635266861s" podCreationTimestamp="2026-04-16 18:02:06 +0000 UTC" firstStartedPulling="2026-04-16 18:02:07.696044959 +0000 UTC m=+1.808852391" lastFinishedPulling="2026-04-16 18:02:25.313235798 +0000 UTC m=+19.426043246" observedRunningTime="2026-04-16 18:02:26.63501324 +0000 UTC m=+20.747820695" watchObservedRunningTime="2026-04-16 18:02:26.635266861 +0000 UTC m=+20.748074332" Apr 16 18:02:26.648433 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.648388 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-m8mjb" podStartSLOduration=2.7222002549999997 podStartE2EDuration="20.648360215s" podCreationTimestamp="2026-04-16 18:02:06 +0000 UTC" firstStartedPulling="2026-04-16 18:02:07.714996082 +0000 UTC m=+1.827803513" lastFinishedPulling="2026-04-16 18:02:25.641156036 +0000 UTC m=+19.753963473" observedRunningTime="2026-04-16 18:02:26.647679532 +0000 UTC m=+20.760486986" watchObservedRunningTime="2026-04-16 18:02:26.648360215 +0000 UTC m=+20.761167669" Apr 16 18:02:26.712945 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:26.712898 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qrflg" podStartSLOduration=2.79780979 podStartE2EDuration="20.712882874s" podCreationTimestamp="2026-04-16 18:02:06 +0000 UTC" firstStartedPulling="2026-04-16 18:02:07.72710918 +0000 UTC m=+1.839916612" lastFinishedPulling="2026-04-16 18:02:25.642182249 +0000 UTC m=+19.754989696" observedRunningTime="2026-04-16 18:02:26.712287478 +0000 UTC m=+20.825094932" watchObservedRunningTime="2026-04-16 18:02:26.712882874 +0000 UTC m=+20.825690327" Apr 16 18:02:27.384197 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:27.384160 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:02:27.415165 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:27.415050 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:02:27.384181179Z","UUID":"92d0a2ab-5801-483d-a4c6-4a5288f93fde","Handler":null,"Name":"","Endpoint":""} Apr 16 18:02:27.417855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:27.417834 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:02:27.417981 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:27.417864 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:02:27.472670 ip-10-0-134-133 kubenswrapper[2577]: I0416 
18:02:27.472636 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:27.472834 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:27.472648 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:27.472834 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:27.472736 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:27.472915 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:27.472833 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:27.612142 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:27.612109 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" event={"ID":"8c882a91b17682408e1f8cbddf91ea4a","Type":"ContainerStarted","Data":"7154df6be388a4a101bd0a91a70e7b5aa120b4a07e4c85964d0b7fd67e85f26b"} Apr 16 18:02:27.613901 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:27.613871 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" event={"ID":"0476aa99-7b98-404d-a37a-dfae7eb89922","Type":"ContainerStarted","Data":"6cb48864b7f4e0d2bfb5f5b81e0c499ee7d6a3ba7d5144dc56ebfeff24a9f8c1"} Apr 16 18:02:27.615134 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:27.615111 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-spc8d" event={"ID":"d777df23-9c77-4ee3-a1ad-07ef46670681","Type":"ContainerStarted","Data":"93c8fddcac04f760920b8d0cdbc13f3c02b7a959b8b701045a48f99e52a2e911"} Apr 16 18:02:27.637875 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:27.637830 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-s2zfb" podStartSLOduration=3.763662728 podStartE2EDuration="21.637813538s" podCreationTimestamp="2026-04-16 18:02:06 +0000 UTC" firstStartedPulling="2026-04-16 18:02:07.73276367 +0000 UTC m=+1.845571102" lastFinishedPulling="2026-04-16 18:02:25.606914465 +0000 UTC m=+19.719721912" observedRunningTime="2026-04-16 18:02:26.726500058 +0000 UTC m=+20.839307504" watchObservedRunningTime="2026-04-16 18:02:27.637813538 +0000 UTC m=+21.750620973" Apr 16 18:02:27.653005 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:27.652952 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-spc8d" podStartSLOduration=4.002551547 podStartE2EDuration="21.652938571s" podCreationTimestamp="2026-04-16 18:02:06 +0000 UTC" firstStartedPulling="2026-04-16 18:02:07.662846327 +0000 UTC m=+1.775653758" lastFinishedPulling="2026-04-16 18:02:25.313233349 +0000 UTC m=+19.426040782" 
observedRunningTime="2026-04-16 18:02:27.652785534 +0000 UTC m=+21.765592988" watchObservedRunningTime="2026-04-16 18:02:27.652938571 +0000 UTC m=+21.765746025" Apr 16 18:02:27.653197 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:27.653171 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-133.ec2.internal" podStartSLOduration=20.653166666 podStartE2EDuration="20.653166666s" podCreationTimestamp="2026-04-16 18:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:02:27.638445622 +0000 UTC m=+21.751253076" watchObservedRunningTime="2026-04-16 18:02:27.653166666 +0000 UTC m=+21.765974120" Apr 16 18:02:28.472620 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:28.472590 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:28.473148 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:28.472700 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:28.619302 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:28.619264 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" event={"ID":"0476aa99-7b98-404d-a37a-dfae7eb89922","Type":"ContainerStarted","Data":"7a25813437b93c4aa90a0d2d3fe14733b81f9848bb4aaec17352b715c8ddc79d"} Apr 16 18:02:28.622618 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:28.622591 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:02:28.623014 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:28.622989 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" event={"ID":"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df","Type":"ContainerStarted","Data":"4e49c8f131db01fa32b6f574355a3913c3cbe2254714ad4624a3d2ad6ede3020"} Apr 16 18:02:28.635604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:28.635539 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mt478" podStartSLOduration=1.9758429560000002 podStartE2EDuration="22.635525686s" podCreationTimestamp="2026-04-16 18:02:06 +0000 UTC" firstStartedPulling="2026-04-16 18:02:07.699529092 +0000 UTC m=+1.812336524" lastFinishedPulling="2026-04-16 18:02:28.35921182 +0000 UTC m=+22.472019254" observedRunningTime="2026-04-16 18:02:28.635281836 +0000 UTC m=+22.748089289" watchObservedRunningTime="2026-04-16 18:02:28.635525686 +0000 UTC m=+22.748333141" Apr 16 18:02:29.473089 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:29.473050 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:29.473567 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:29.473050 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:29.473567 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:29.473190 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:29.473567 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:29.473312 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:29.731863 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:29.731781 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4spvw" Apr 16 18:02:29.732478 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:29.732458 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4spvw" Apr 16 18:02:30.472787 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:30.472758 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:30.472937 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:30.472880 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:30.630449 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:30.630213 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:02:30.632151 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:30.632127 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4spvw" Apr 16 18:02:30.632634 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:30.632612 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4spvw" Apr 16 18:02:31.472472 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:31.472439 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:31.472627 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:31.472439 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:31.472627 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:31.472550 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:31.472627 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:31.472619 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:31.635175 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:31.635142 2577 generic.go:358] "Generic (PLEG): container finished" podID="42f0f7c4-f605-4e8b-a431-64e78857571a" containerID="2d272741fe0faa61ca05efa76d93264268275d615e6a67eea3343bdc9e5cbca6" exitCode=0 Apr 16 18:02:31.636018 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:31.635207 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-969sx" event={"ID":"42f0f7c4-f605-4e8b-a431-64e78857571a","Type":"ContainerDied","Data":"2d272741fe0faa61ca05efa76d93264268275d615e6a67eea3343bdc9e5cbca6"} Apr 16 18:02:31.640063 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:31.640047 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:02:31.640431 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:31.640393 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" event={"ID":"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df","Type":"ContainerStarted","Data":"d8d75db8c4d2994d9bb0ef3bd5481e0f6e8c88b6437bb0b8e415f268ae84c317"} Apr 16 18:02:31.640842 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:31.640821 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:31.640919 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:31.640848 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:31.640969 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:31.640929 2577 scope.go:117] "RemoveContainer" containerID="5773b990baf8368c28e31f67bf34e72a473e4bf8be8d2ec06f5cbff75994d969" Apr 16 18:02:31.656280 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:31.656257 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:32.473668 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.473485 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:32.473826 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:32.473763 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:32.557321 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.557232 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bsftx"] Apr 16 18:02:32.576692 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.576659 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dfdlg"] Apr 16 18:02:32.576826 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.576743 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:32.576884 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:32.576818 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:32.582393 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.581822 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-znzwl"] Apr 16 18:02:32.582393 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.581955 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:32.582393 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:32.582154 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:32.644058 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.644027 2577 generic.go:358] "Generic (PLEG): container finished" podID="42f0f7c4-f605-4e8b-a431-64e78857571a" containerID="ce508175b1100d1108d0334d43fc10f84f952904e6b091ece439cb365f6a5b4f" exitCode=0 Apr 16 18:02:32.644570 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.644116 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-969sx" event={"ID":"42f0f7c4-f605-4e8b-a431-64e78857571a","Type":"ContainerDied","Data":"ce508175b1100d1108d0334d43fc10f84f952904e6b091ece439cb365f6a5b4f"} Apr 16 18:02:32.647339 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.647321 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:02:32.647639 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.647618 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" event={"ID":"0041c0cb-37ba-4e2d-8ab3-73fe90eb40df","Type":"ContainerStarted","Data":"e75cb70d1042efdd12c4f9ddbe37138481654fdd11a0c855e2b7bf2643361158"} Apr 16 18:02:32.647718 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.647670 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:32.647806 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:32.647784 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:32.648037 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.648019 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:32.662124 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.662098 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:02:32.803978 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:32.803924 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" podStartSLOduration=8.79670147 podStartE2EDuration="26.803908762s" podCreationTimestamp="2026-04-16 18:02:06 +0000 UTC" firstStartedPulling="2026-04-16 18:02:07.676621935 +0000 UTC m=+1.789429373" lastFinishedPulling="2026-04-16 18:02:25.683829234 +0000 UTC m=+19.796636665" observedRunningTime="2026-04-16 18:02:32.803440789 +0000 UTC m=+26.916248243" watchObservedRunningTime="2026-04-16 18:02:32.803908762 +0000 UTC m=+26.916716213" Apr 16 18:02:33.651408 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:33.651362 2577 generic.go:358] "Generic (PLEG): container finished" podID="42f0f7c4-f605-4e8b-a431-64e78857571a" containerID="65b6f88956105ae8b7f1d86ed4e71f43cb58f05d764e20a60bec2600cf72bbc0" exitCode=0 Apr 16 18:02:33.651763 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:33.651412 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-969sx" event={"ID":"42f0f7c4-f605-4e8b-a431-64e78857571a","Type":"ContainerDied","Data":"65b6f88956105ae8b7f1d86ed4e71f43cb58f05d764e20a60bec2600cf72bbc0"} Apr 16 18:02:34.473208 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:34.473159 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:34.473208 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:34.473194 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:34.473447 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:34.473354 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:34.474666 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:34.473714 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:34.474666 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:34.473790 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:34.474666 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:34.473908 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:36.474063 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:36.474030 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:36.474523 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:36.474146 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:36.474523 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:36.474173 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:36.474523 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:36.474225 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:36.474523 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:36.474255 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:36.474523 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:36.474401 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:38.473111 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.473069 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:38.473111 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.473103 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:38.473630 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.473102 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:38.473630 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:38.473219 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bsftx" podUID="842ea51c-5928-423a-9820-b4041ccdbe7b" Apr 16 18:02:38.473630 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:38.473411 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:02:38.473630 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:38.473503 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfdlg" podUID="eb29446e-bb65-416b-a40d-d985b58d7505" Apr 16 18:02:38.747061 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.747033 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-133.ec2.internal" event="NodeReady" Apr 16 18:02:38.747246 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.747177 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:02:38.788844 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.788813 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-85dc8575d4-v85pb"] Apr 16 18:02:38.825766 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.825733 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85dc8575d4-v85pb"] Apr 16 18:02:38.825766 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.825777 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-77v8t"] Apr 16 18:02:38.825993 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.825875 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:38.828185 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.828156 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:02:38.828185 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.828173 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:02:38.829588 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.829543 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:02:38.829588 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.829576 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jwfwh\"" Apr 16 18:02:38.841225 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.841195 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8rjlz"] Apr 16 18:02:38.841386 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.841354 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:38.844022 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.843991 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:02:38.845825 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.845801 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:02:38.845825 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.845807 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:02:38.845992 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.845846 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-j5p9z\"" Apr 16 18:02:38.857239 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.857207 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-77v8t"] Apr 16 18:02:38.857239 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.857246 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8rjlz"] Apr 16 18:02:38.857468 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.857358 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:02:38.862034 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.861995 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:02:38.862182 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.862045 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:02:38.862252 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.862181 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dxq2x\"" Apr 16 18:02:38.862338 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.862320 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:02:38.914000 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.913963 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-installation-pull-secrets\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:38.914180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.914007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-image-registry-private-configuration\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:38.914180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.914057 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-ca-trust-extracted\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:38.914180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.914147 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-certificates\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:38.914321 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.914197 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-trusted-ca\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:38.914321 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.914220 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls\") pod 
\"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:38.914321 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.914250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdls6\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-kube-api-access-tdls6\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:38.914321 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:38.914294 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-bound-sa-token\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.015423 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015322 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:39.015423 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015409 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-ca-trust-extracted\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.015626 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015461 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-certificates\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.015626 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h2zn\" (UniqueName: \"kubernetes.io/projected/176aef22-2713-42bf-81d6-9602a79bf10f-kube-api-access-6h2zn\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:02:39.015626 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-trusted-ca\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.015626 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015547 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6429bf79-1554-458a-8ed2-de631c73ca89-tmp-dir\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") 
" pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:39.015626 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.015920 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015630 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8wv8\" (UniqueName: \"kubernetes.io/projected/6429bf79-1554-458a-8ed2-de631c73ca89-kube-api-access-m8wv8\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:39.015920 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015714 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdls6\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-kube-api-access-tdls6\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.015920 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.015737 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:02:39.015920 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015743 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-bound-sa-token\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.015920 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.015754 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85dc8575d4-v85pb: secret "image-registry-tls" not found Apr 16 18:02:39.015920 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015770 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:02:39.015920 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.015807 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls podName:3e36eb57-6fbb-4969-815e-9ff1adf4c8f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:39.515789718 +0000 UTC m=+33.628597185 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls") pod "image-registry-85dc8575d4-v85pb" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4") : secret "image-registry-tls" not found Apr 16 18:02:39.016234 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-installation-pull-secrets\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.016234 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.015997 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-image-registry-private-configuration\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.016234 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.016027 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6429bf79-1554-458a-8ed2-de631c73ca89-config-volume\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:39.016394 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.016299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-ca-trust-extracted\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.016507 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.016452 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-certificates\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.019067 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.018947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-trusted-ca\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.020352 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.020331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-installation-pull-secrets\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.020928 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.020908 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-image-registry-private-configuration\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.030103 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.030070 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdls6\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-kube-api-access-tdls6\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.032131 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.032101 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-bound-sa-token\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.116638 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.116595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:02:39.116638 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.116643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6429bf79-1554-458a-8ed2-de631c73ca89-config-volume\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:39.116895 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.116658 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:39.116895 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.116682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:39.116895 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.116727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6h2zn\" (UniqueName: \"kubernetes.io/projected/176aef22-2713-42bf-81d6-9602a79bf10f-kube-api-access-6h2zn\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:02:39.116895 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.116761 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:39.116895 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.116772 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/6429bf79-1554-458a-8ed2-de631c73ca89-tmp-dir\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:39.116895 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.116780 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:39.116895 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.116816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8wv8\" (UniqueName: \"kubernetes.io/projected/6429bf79-1554-458a-8ed2-de631c73ca89-kube-api-access-m8wv8\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:39.116895 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.116826 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:39.116895 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.116835 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert podName:176aef22-2713-42bf-81d6-9602a79bf10f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:39.616814176 +0000 UTC m=+33.729621609 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert") pod "ingress-canary-8rjlz" (UID: "176aef22-2713-42bf-81d6-9602a79bf10f") : secret "canary-serving-cert" not found Apr 16 18:02:39.116895 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.116884 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls podName:6429bf79-1554-458a-8ed2-de631c73ca89 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:39.616868391 +0000 UTC m=+33.729675833 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls") pod "dns-default-77v8t" (UID: "6429bf79-1554-458a-8ed2-de631c73ca89") : secret "dns-default-metrics-tls" not found Apr 16 18:02:39.116895 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.116904 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs podName:f009e89a-5e15-4d47-81de-24ab98cb437b nodeName:}" failed. No retries permitted until 2026-04-16 18:03:11.11689497 +0000 UTC m=+65.229702408 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs") pod "network-metrics-daemon-znzwl" (UID: "f009e89a-5e15-4d47-81de-24ab98cb437b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:39.117256 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.117089 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6429bf79-1554-458a-8ed2-de631c73ca89-tmp-dir\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:39.117256 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.117163 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6429bf79-1554-458a-8ed2-de631c73ca89-config-volume\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:39.129769 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.129738 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h2zn\" (UniqueName: \"kubernetes.io/projected/176aef22-2713-42bf-81d6-9602a79bf10f-kube-api-access-6h2zn\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:02:39.131424 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.131397 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8wv8\" (UniqueName: \"kubernetes.io/projected/6429bf79-1554-458a-8ed2-de631c73ca89-kube-api-access-m8wv8\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:39.218109 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.218073 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p44wv\" (UniqueName: \"kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv\") pod \"network-check-target-dfdlg\" (UID: \"eb29446e-bb65-416b-a40d-d985b58d7505\") " pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:39.218269 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.218240 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:39.218269 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.218262 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:39.218349 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.218273 2577 projected.go:194] Error preparing data for projected volume kube-api-access-p44wv for pod openshift-network-diagnostics/network-check-target-dfdlg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:39.218427 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.218360 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv podName:eb29446e-bb65-416b-a40d-d985b58d7505 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:11.218345193 +0000 UTC m=+65.331152631 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-p44wv" (UniqueName: "kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv") pod "network-check-target-dfdlg" (UID: "eb29446e-bb65-416b-a40d-d985b58d7505") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:39.520323 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.520284 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:39.520748 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.520415 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:02:39.520748 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.520439 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85dc8575d4-v85pb: secret "image-registry-tls" not found Apr 16 18:02:39.520748 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.520523 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls podName:3e36eb57-6fbb-4969-815e-9ff1adf4c8f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:40.520501182 +0000 UTC m=+34.633308614 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls") pod "image-registry-85dc8575d4-v85pb" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4") : secret "image-registry-tls" not found Apr 16 18:02:39.620833 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.620797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:02:39.621025 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.620845 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:39.621025 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.620973 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:39.621025 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.621021 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls podName:6429bf79-1554-458a-8ed2-de631c73ca89 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:40.62100716 +0000 UTC m=+34.733814593 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls") pod "dns-default-77v8t" (UID: "6429bf79-1554-458a-8ed2-de631c73ca89") : secret "dns-default-metrics-tls" not found Apr 16 18:02:39.621182 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.620971 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:39.621182 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:39.621049 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert podName:176aef22-2713-42bf-81d6-9602a79bf10f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:40.621043465 +0000 UTC m=+34.733850896 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert") pod "ingress-canary-8rjlz" (UID: "176aef22-2713-42bf-81d6-9602a79bf10f") : secret "canary-serving-cert" not found Apr 16 18:02:39.665774 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:39.665744 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-969sx" event={"ID":"42f0f7c4-f605-4e8b-a431-64e78857571a","Type":"ContainerStarted","Data":"850506ff73a3292e754b62346e06667c54531006d6669b9ca80b07d103621d39"} Apr 16 18:02:40.472595 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.472545 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:02:40.472595 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.472581 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:40.472905 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.472616 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:02:40.475360 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.475339 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:02:40.476382 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.476347 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:02:40.476508 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.476349 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:02:40.476508 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.476422 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fs95d\"" Apr 16 18:02:40.476508 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.476389 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jt7dn\"" Apr 16 18:02:40.476508 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.476472 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:02:40.528467 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.528435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:40.528963 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:40.528587 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:02:40.528963 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:40.528605 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85dc8575d4-v85pb: secret "image-registry-tls" not found Apr 16 18:02:40.528963 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:40.528657 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls podName:3e36eb57-6fbb-4969-815e-9ff1adf4c8f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:42.528643231 +0000 UTC m=+36.641450664 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls") pod "image-registry-85dc8575d4-v85pb" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4") : secret "image-registry-tls" not found Apr 16 18:02:40.629472 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.629437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:40.629603 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.629537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:02:40.629603 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:40.629591 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:40.629704 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:40.629614 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:40.629704 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:40.629653 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls podName:6429bf79-1554-458a-8ed2-de631c73ca89 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:42.629639016 +0000 UTC m=+36.742446448 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls") pod "dns-default-77v8t" (UID: "6429bf79-1554-458a-8ed2-de631c73ca89") : secret "dns-default-metrics-tls" not found Apr 16 18:02:40.629704 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:40.629667 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert podName:176aef22-2713-42bf-81d6-9602a79bf10f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:42.629660983 +0000 UTC m=+36.742468415 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert") pod "ingress-canary-8rjlz" (UID: "176aef22-2713-42bf-81d6-9602a79bf10f") : secret "canary-serving-cert" not found Apr 16 18:02:40.669777 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.669744 2577 generic.go:358] "Generic (PLEG): container finished" podID="42f0f7c4-f605-4e8b-a431-64e78857571a" containerID="850506ff73a3292e754b62346e06667c54531006d6669b9ca80b07d103621d39" exitCode=0 Apr 16 18:02:40.669927 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:40.669792 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-969sx" event={"ID":"42f0f7c4-f605-4e8b-a431-64e78857571a","Type":"ContainerDied","Data":"850506ff73a3292e754b62346e06667c54531006d6669b9ca80b07d103621d39"} Apr 16 18:02:41.673815 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:41.673784 2577 generic.go:358] "Generic (PLEG): container finished" podID="42f0f7c4-f605-4e8b-a431-64e78857571a" containerID="866b564c91a05a72e6bacd78aa4a4efcc1f746f7cffab41618656ba97da93a5b" exitCode=0 Apr 16 18:02:41.674168 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:41.673832 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-969sx" event={"ID":"42f0f7c4-f605-4e8b-a431-64e78857571a","Type":"ContainerDied","Data":"866b564c91a05a72e6bacd78aa4a4efcc1f746f7cffab41618656ba97da93a5b"} Apr 16 18:02:42.243247 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:42.243206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:42.245904 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:42.245880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/842ea51c-5928-423a-9820-b4041ccdbe7b-original-pull-secret\") pod \"global-pull-secret-syncer-bsftx\" (UID: \"842ea51c-5928-423a-9820-b4041ccdbe7b\") " pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:42.282201 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:42.282164 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bsftx" Apr 16 18:02:42.468298 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:42.468110 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bsftx"] Apr 16 18:02:42.472802 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:02:42.472769 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod842ea51c_5928_423a_9820_b4041ccdbe7b.slice/crio-3f858eed935de2c6fbc96d827fe39ee944416625a3611ce11def4798225898c1 WatchSource:0}: Error finding container 3f858eed935de2c6fbc96d827fe39ee944416625a3611ce11def4798225898c1: Status 404 returned error can't find the container with id 3f858eed935de2c6fbc96d827fe39ee944416625a3611ce11def4798225898c1 Apr 16 18:02:42.544972 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:42.544935 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:42.545151 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:42.545058 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:02:42.545151 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:42.545071 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85dc8575d4-v85pb: secret "image-registry-tls" not found Apr 16 18:02:42.545151 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:42.545123 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls podName:3e36eb57-6fbb-4969-815e-9ff1adf4c8f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:46.545108827 +0000 UTC m=+40.657916259 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls") pod "image-registry-85dc8575d4-v85pb" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4") : secret "image-registry-tls" not found Apr 16 18:02:42.646130 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:42.646020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:02:42.646130 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:42.646065 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:42.646296 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:42.646170 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:42.646296 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:42.646172 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:42.646296 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:42.646224 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls podName:6429bf79-1554-458a-8ed2-de631c73ca89 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:46.646209622 +0000 UTC m=+40.759017054 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls") pod "dns-default-77v8t" (UID: "6429bf79-1554-458a-8ed2-de631c73ca89") : secret "dns-default-metrics-tls" not found Apr 16 18:02:42.646296 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:42.646238 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert podName:176aef22-2713-42bf-81d6-9602a79bf10f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:46.646230889 +0000 UTC m=+40.759038320 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert") pod "ingress-canary-8rjlz" (UID: "176aef22-2713-42bf-81d6-9602a79bf10f") : secret "canary-serving-cert" not found Apr 16 18:02:42.680485 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:42.680441 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-969sx" event={"ID":"42f0f7c4-f605-4e8b-a431-64e78857571a","Type":"ContainerStarted","Data":"20d0d4a4ac74c1905c6d82e47d1bdd3edd4ffd7d62a6138ff699b7ee9f249839"} Apr 16 18:02:42.681478 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:42.681455 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bsftx" event={"ID":"842ea51c-5928-423a-9820-b4041ccdbe7b","Type":"ContainerStarted","Data":"3f858eed935de2c6fbc96d827fe39ee944416625a3611ce11def4798225898c1"} Apr 16 18:02:42.733064 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:42.733014 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-969sx" podStartSLOduration=4.986845166 podStartE2EDuration="36.732997822s" podCreationTimestamp="2026-04-16 18:02:06 +0000 UTC" firstStartedPulling="2026-04-16 18:02:07.739106671 +0000 UTC m=+1.851914104" lastFinishedPulling="2026-04-16 18:02:39.485259315 +0000 UTC m=+33.598066760" observedRunningTime="2026-04-16 18:02:42.732905825 +0000 UTC m=+36.845713289" watchObservedRunningTime="2026-04-16 18:02:42.732997822 +0000 UTC m=+36.845805275" Apr 16 18:02:46.575668 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:46.575629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:46.576150 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:46.575826 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:02:46.576150 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:46.575849 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85dc8575d4-v85pb: secret "image-registry-tls" not found Apr 16 18:02:46.576150 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:46.575920 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls podName:3e36eb57-6fbb-4969-815e-9ff1adf4c8f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:54.575898165 +0000 UTC m=+48.688705597 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls") pod "image-registry-85dc8575d4-v85pb" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4") : secret "image-registry-tls" not found Apr 16 18:02:46.676037 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:46.675992 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:46.676210 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:46.676132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:02:46.676210 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:46.676163 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:46.676295 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:46.676222 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls podName:6429bf79-1554-458a-8ed2-de631c73ca89 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:54.676209035 +0000 UTC m=+48.789016467 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls") pod "dns-default-77v8t" (UID: "6429bf79-1554-458a-8ed2-de631c73ca89") : secret "dns-default-metrics-tls" not found Apr 16 18:02:46.676295 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:46.676238 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:46.676295 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:46.676286 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert podName:176aef22-2713-42bf-81d6-9602a79bf10f nodeName:}" failed. No retries permitted until 2026-04-16 18:02:54.676271077 +0000 UTC m=+48.789078510 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert") pod "ingress-canary-8rjlz" (UID: "176aef22-2713-42bf-81d6-9602a79bf10f") : secret "canary-serving-cert" not found Apr 16 18:02:47.693126 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:47.693088 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bsftx" event={"ID":"842ea51c-5928-423a-9820-b4041ccdbe7b","Type":"ContainerStarted","Data":"ffeb7d7dc34e2347632bad67496e2da31e3b211ac16488fcec97ad7bc8c37050"} Apr 16 18:02:47.708827 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:47.708771 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bsftx" podStartSLOduration=33.044589874 podStartE2EDuration="37.708756786s" podCreationTimestamp="2026-04-16 18:02:10 +0000 UTC" firstStartedPulling="2026-04-16 18:02:42.474506908 +0000 UTC m=+36.587314340" lastFinishedPulling="2026-04-16 18:02:47.138673804 +0000 UTC m=+41.251481252" observedRunningTime="2026-04-16 18:02:47.708700617 +0000 UTC m=+41.821508071" watchObservedRunningTime="2026-04-16 18:02:47.708756786 +0000 UTC m=+41.821564240" Apr 16 18:02:54.631512 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:54.631471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:02:54.631967 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:54.631658 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:02:54.631967 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:54.631673 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85dc8575d4-v85pb: secret "image-registry-tls" not found Apr 16 18:02:54.631967 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:54.631730 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls podName:3e36eb57-6fbb-4969-815e-9ff1adf4c8f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:10.63171181 +0000 UTC m=+64.744519244 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls") pod "image-registry-85dc8575d4-v85pb" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4") : secret "image-registry-tls" not found Apr 16 18:02:54.732321 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:54.732284 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:02:54.732321 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:02:54.732328 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:02:54.732542 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:54.732448 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:54.732542 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:54.732481 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:54.732542 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:54.732527 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert podName:176aef22-2713-42bf-81d6-9602a79bf10f nodeName:}" failed. No retries permitted until 2026-04-16 18:03:10.732508767 +0000 UTC m=+64.845316201 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert") pod "ingress-canary-8rjlz" (UID: "176aef22-2713-42bf-81d6-9602a79bf10f") : secret "canary-serving-cert" not found Apr 16 18:02:54.732542 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:02:54.732540 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls podName:6429bf79-1554-458a-8ed2-de631c73ca89 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:10.73253471 +0000 UTC m=+64.845342142 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls") pod "dns-default-77v8t" (UID: "6429bf79-1554-458a-8ed2-de631c73ca89") : secret "dns-default-metrics-tls" not found Apr 16 18:03:04.663965 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:04.663940 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ptwtr" Apr 16 18:03:10.641571 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:10.641527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:03:10.642040 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:10.641642 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:03:10.642040 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:10.641653 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85dc8575d4-v85pb: secret "image-registry-tls" not found Apr 16 18:03:10.642040 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:10.641735 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls podName:3e36eb57-6fbb-4969-815e-9ff1adf4c8f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:42.641720274 +0000 UTC m=+96.754527706 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls") pod "image-registry-85dc8575d4-v85pb" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4") : secret "image-registry-tls" not found Apr 16 18:03:10.742094 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:10.742061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:03:10.742094 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:10.742097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:03:10.742269 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:10.742191 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:10.742269 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:10.742197 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:03:10.742269 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:10.742237 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls podName:6429bf79-1554-458a-8ed2-de631c73ca89 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:42.742225 +0000 UTC m=+96.855032432 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls") pod "dns-default-77v8t" (UID: "6429bf79-1554-458a-8ed2-de631c73ca89") : secret "dns-default-metrics-tls" not found Apr 16 18:03:10.742269 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:10.742250 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert podName:176aef22-2713-42bf-81d6-9602a79bf10f nodeName:}" failed. No retries permitted until 2026-04-16 18:03:42.742244161 +0000 UTC m=+96.855051593 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert") pod "ingress-canary-8rjlz" (UID: "176aef22-2713-42bf-81d6-9602a79bf10f") : secret "canary-serving-cert" not found Apr 16 18:03:11.144274 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:11.144240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:03:11.146681 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:11.146662 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:03:11.155328 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:11.155307 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:03:11.155406 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:11.155380 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs podName:f009e89a-5e15-4d47-81de-24ab98cb437b nodeName:}" failed. No retries permitted until 2026-04-16 18:04:15.15535152 +0000 UTC m=+129.268158952 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs") pod "network-metrics-daemon-znzwl" (UID: "f009e89a-5e15-4d47-81de-24ab98cb437b") : secret "metrics-daemon-secret" not found Apr 16 18:03:11.244755 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:11.244720 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p44wv\" (UniqueName: \"kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv\") pod \"network-check-target-dfdlg\" (UID: \"eb29446e-bb65-416b-a40d-d985b58d7505\") " pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:03:11.247240 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:11.247221 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:03:11.257190 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:11.257170 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:03:11.269222 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:11.269201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p44wv\" (UniqueName: \"kubernetes.io/projected/eb29446e-bb65-416b-a40d-d985b58d7505-kube-api-access-p44wv\") pod \"network-check-target-dfdlg\" (UID: \"eb29446e-bb65-416b-a40d-d985b58d7505\") " pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:03:11.390105 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:11.390076 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fs95d\"" Apr 16 18:03:11.397994 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:11.397947 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:03:11.524766 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:11.524738 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dfdlg"] Apr 16 18:03:11.528862 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:03:11.528837 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb29446e_bb65_416b_a40d_d985b58d7505.slice/crio-ee5df4ad51b882f2cf08a6dc5bb5278bcad15d5014a49ed488a0b990e37f63aa WatchSource:0}: Error finding container ee5df4ad51b882f2cf08a6dc5bb5278bcad15d5014a49ed488a0b990e37f63aa: Status 404 returned error can't find the container with id ee5df4ad51b882f2cf08a6dc5bb5278bcad15d5014a49ed488a0b990e37f63aa Apr 16 18:03:11.739058 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:11.738971 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dfdlg" event={"ID":"eb29446e-bb65-416b-a40d-d985b58d7505","Type":"ContainerStarted","Data":"ee5df4ad51b882f2cf08a6dc5bb5278bcad15d5014a49ed488a0b990e37f63aa"} Apr 16 18:03:14.747105 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:14.747071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dfdlg" event={"ID":"eb29446e-bb65-416b-a40d-d985b58d7505","Type":"ContainerStarted","Data":"6684c631f5fbde4e330717c3cef0ec8be1219853653fc5447abbb855b6732bea"} Apr 16 18:03:14.747481 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:14.747216 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:03:14.768524 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:14.768405 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-dfdlg" podStartSLOduration=66.170895754 podStartE2EDuration="1m8.768391555s" podCreationTimestamp="2026-04-16 18:02:06 +0000 UTC" firstStartedPulling="2026-04-16 18:03:11.530671088 +0000 UTC m=+65.643478520" lastFinishedPulling="2026-04-16 18:03:14.128166883 +0000 UTC m=+68.240974321" observedRunningTime="2026-04-16 18:03:14.767519118 +0000 UTC m=+68.880326566" watchObservedRunningTime="2026-04-16 18:03:14.768391555 +0000 UTC m=+68.881199007" Apr 16 18:03:42.670777 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:42.670739 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:03:42.671254 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:42.670899 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:03:42.671254 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:42.670920 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85dc8575d4-v85pb: secret "image-registry-tls" not found Apr 16 18:03:42.671254 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:42.670985 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls 
podName:3e36eb57-6fbb-4969-815e-9ff1adf4c8f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:46.670964851 +0000 UTC m=+160.783772287 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls") pod "image-registry-85dc8575d4-v85pb" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4") : secret "image-registry-tls" not found Apr 16 18:03:42.771798 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:42.771760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:03:42.771798 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:42.771802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:03:42.771986 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:42.771906 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:03:42.771986 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:42.771959 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls podName:6429bf79-1554-458a-8ed2-de631c73ca89 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:46.771945198 +0000 UTC m=+160.884752630 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls") pod "dns-default-77v8t" (UID: "6429bf79-1554-458a-8ed2-de631c73ca89") : secret "dns-default-metrics-tls" not found Apr 16 18:03:42.771986 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:42.771906 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:03:42.772094 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:03:42.772044 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert podName:176aef22-2713-42bf-81d6-9602a79bf10f nodeName:}" failed. No retries permitted until 2026-04-16 18:04:46.772031365 +0000 UTC m=+160.884838798 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert") pod "ingress-canary-8rjlz" (UID: "176aef22-2713-42bf-81d6-9602a79bf10f") : secret "canary-serving-cert" not found Apr 16 18:03:45.751854 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:03:45.751822 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dfdlg" Apr 16 18:04:15.206328 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:15.206270 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:04:15.206854 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:15.206432 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:04:15.206854 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:15.206505 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs podName:f009e89a-5e15-4d47-81de-24ab98cb437b nodeName:}" failed. No retries permitted until 2026-04-16 18:06:17.206488789 +0000 UTC m=+251.319296221 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs") pod "network-metrics-daemon-znzwl" (UID: "f009e89a-5e15-4d47-81de-24ab98cb437b") : secret "metrics-daemon-secret" not found Apr 16 18:04:39.733451 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.733419 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9"] Apr 16 18:04:39.736160 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.736143 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" Apr 16 18:04:39.738341 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.738303 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 18:04:39.738341 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.738319 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 18:04:39.738341 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.738330 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 18:04:39.738604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.738346 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:04:39.738604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.738410 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-xc2n2\"" Apr 16 18:04:39.743971 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.743949 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9"] Apr 16 18:04:39.778644 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.778605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/646dbece-2434-4ec6-ad3e-0f3009cde6a3-serving-cert\") pod \"service-ca-operator-69965bb79d-p68j9\" (UID: \"646dbece-2434-4ec6-ad3e-0f3009cde6a3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" Apr 16 18:04:39.778815 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.778667 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/646dbece-2434-4ec6-ad3e-0f3009cde6a3-config\") pod \"service-ca-operator-69965bb79d-p68j9\" (UID: \"646dbece-2434-4ec6-ad3e-0f3009cde6a3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" Apr 16 18:04:39.778815 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.778695 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7gxv\" (UniqueName: \"kubernetes.io/projected/646dbece-2434-4ec6-ad3e-0f3009cde6a3-kube-api-access-n7gxv\") pod \"service-ca-operator-69965bb79d-p68j9\" (UID: \"646dbece-2434-4ec6-ad3e-0f3009cde6a3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" Apr 16 18:04:39.828908 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.828879 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-5crmb"] Apr 16 18:04:39.831607 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.831590 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-5crmb" Apr 16 18:04:39.833628 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.833607 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:04:39.833740 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.833612 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-92gqs\"" Apr 16 18:04:39.833740 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.833612 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 18:04:39.837446 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.837426 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-lbrtk"] Apr 16 18:04:39.840437 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.840416 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5"] Apr 16 18:04:39.840600 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.840568 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.842559 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.842543 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-9xdpp\"" Apr 16 18:04:39.843272 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.843249 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 18:04:39.843383 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.843353 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-5crmb"] Apr 16 18:04:39.843444 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.843248 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:04:39.843489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.843482 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:39.844081 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.844060 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:04:39.844363 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.844343 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 18:04:39.845524 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.845506 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:04:39.845630 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.845542 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 18:04:39.846918 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.846892 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:04:39.847036 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.847017 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4jhmt\"" Apr 16 18:04:39.849315 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.848864 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 18:04:39.851942 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.851903 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 18:04:39.854541 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.854516 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5"] Apr 16 18:04:39.855352 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.855324 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-lbrtk"] Apr 16 18:04:39.879556 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/646dbece-2434-4ec6-ad3e-0f3009cde6a3-config\") pod \"service-ca-operator-69965bb79d-p68j9\" (UID: \"646dbece-2434-4ec6-ad3e-0f3009cde6a3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" Apr 16 18:04:39.879556 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7gxv\" (UniqueName: \"kubernetes.io/projected/646dbece-2434-4ec6-ad3e-0f3009cde6a3-kube-api-access-n7gxv\") pod \"service-ca-operator-69965bb79d-p68j9\" (UID: \"646dbece-2434-4ec6-ad3e-0f3009cde6a3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" Apr 16 18:04:39.879809 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879590 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/879e1a6e-7abe-4b70-9fdd-76b30b854006-tmp\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 
18:04:39.879809 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879610 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-747zh\" (UniqueName: \"kubernetes.io/projected/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-kube-api-access-747zh\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:39.879809 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/879e1a6e-7abe-4b70-9fdd-76b30b854006-snapshots\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.879809 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/646dbece-2434-4ec6-ad3e-0f3009cde6a3-serving-cert\") pod \"service-ca-operator-69965bb79d-p68j9\" (UID: \"646dbece-2434-4ec6-ad3e-0f3009cde6a3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" Apr 16 18:04:39.879809 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtj2r\" (UniqueName: \"kubernetes.io/projected/bbff1a65-2590-46a2-b70c-3bc7271945eb-kube-api-access-dtj2r\") pod \"volume-data-source-validator-7d955d5dd4-5crmb\" (UID: \"bbff1a65-2590-46a2-b70c-3bc7271945eb\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-5crmb" Apr 16 18:04:39.879809 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879724 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:39.879809 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879782 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwz75\" (UniqueName: \"kubernetes.io/projected/879e1a6e-7abe-4b70-9fdd-76b30b854006-kube-api-access-jwz75\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.880104 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879836 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/879e1a6e-7abe-4b70-9fdd-76b30b854006-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.880104 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879873 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:39.880104 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/879e1a6e-7abe-4b70-9fdd-76b30b854006-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.880104 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.879922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/879e1a6e-7abe-4b70-9fdd-76b30b854006-serving-cert\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.880223 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.880116 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/646dbece-2434-4ec6-ad3e-0f3009cde6a3-config\") pod \"service-ca-operator-69965bb79d-p68j9\" (UID: \"646dbece-2434-4ec6-ad3e-0f3009cde6a3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" Apr 16 18:04:39.881840 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.881812 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/646dbece-2434-4ec6-ad3e-0f3009cde6a3-serving-cert\") pod \"service-ca-operator-69965bb79d-p68j9\" (UID: \"646dbece-2434-4ec6-ad3e-0f3009cde6a3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" Apr 16 18:04:39.889475 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.889450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7gxv\" (UniqueName: \"kubernetes.io/projected/646dbece-2434-4ec6-ad3e-0f3009cde6a3-kube-api-access-n7gxv\") pod \"service-ca-operator-69965bb79d-p68j9\" (UID: \"646dbece-2434-4ec6-ad3e-0f3009cde6a3\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" Apr 16 18:04:39.937319 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.937283 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6"] Apr 16 18:04:39.940230 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.940215 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" Apr 16 18:04:39.942744 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.942722 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 18:04:39.942744 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.942733 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:04:39.942744 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.942739 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 18:04:39.943101 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.943087 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 18:04:39.943265 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.943247 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-l7b55\"" Apr 16 18:04:39.945554 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.945532 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-6g6wf"] Apr 16 18:04:39.948396 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.948360 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:39.952687 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.952669 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 18:04:39.958483 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.958455 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:04:39.958580 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.958563 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 18:04:39.958673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.958650 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 18:04:39.959238 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.959223 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-5vvdl\"" Apr 16 18:04:39.959944 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.959926 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6"] Apr 16 18:04:39.967068 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.967049 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 18:04:39.974460 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.974427 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-6g6wf"] Apr 16 
18:04:39.980509 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.980480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71948437-c3bf-419e-8170-14db67f520f4-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-zrbs6\" (UID: \"71948437-c3bf-419e-8170-14db67f520f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" Apr 16 18:04:39.980643 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.980535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwz75\" (UniqueName: \"kubernetes.io/projected/879e1a6e-7abe-4b70-9fdd-76b30b854006-kube-api-access-jwz75\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.980698 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.980611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-997vj\" (UniqueName: \"kubernetes.io/projected/71948437-c3bf-419e-8170-14db67f520f4-kube-api-access-997vj\") pod \"kube-storage-version-migrator-operator-756bb7d76f-zrbs6\" (UID: \"71948437-c3bf-419e-8170-14db67f520f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" Apr 16 18:04:39.980698 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.980692 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39964503-ddba-4c2e-9063-e712eb49041b-trusted-ca\") pod \"console-operator-d87b8d5fc-6g6wf\" (UID: \"39964503-ddba-4c2e-9063-e712eb49041b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:39.980764 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.980718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/879e1a6e-7abe-4b70-9fdd-76b30b854006-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.980764 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.980753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:39.980856 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.980807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/879e1a6e-7abe-4b70-9fdd-76b30b854006-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.980856 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.980840 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/879e1a6e-7abe-4b70-9fdd-76b30b854006-serving-cert\") pod 
\"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.980856 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:39.980852 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:39.981003 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.980879 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39964503-ddba-4c2e-9063-e712eb49041b-serving-cert\") pod \"console-operator-d87b8d5fc-6g6wf\" (UID: \"39964503-ddba-4c2e-9063-e712eb49041b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:39.981003 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:39.980912 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls podName:15702bb5-aa4d-4152-b4a2-faadc3c7fa5f nodeName:}" failed. No retries permitted until 2026-04-16 18:04:40.480892275 +0000 UTC m=+154.593699723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-cmpk5" (UID: "15702bb5-aa4d-4152-b4a2-faadc3c7fa5f") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:39.981003 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.980970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/879e1a6e-7abe-4b70-9fdd-76b30b854006-tmp\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.981156 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.981003 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71948437-c3bf-419e-8170-14db67f520f4-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-zrbs6\" (UID: \"71948437-c3bf-419e-8170-14db67f520f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" Apr 16 18:04:39.981156 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.981037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-747zh\" (UniqueName: \"kubernetes.io/projected/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-kube-api-access-747zh\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:39.981156 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.981061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/879e1a6e-7abe-4b70-9fdd-76b30b854006-snapshots\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.981156 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.981094 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ng5wt\" (UniqueName: \"kubernetes.io/projected/39964503-ddba-4c2e-9063-e712eb49041b-kube-api-access-ng5wt\") pod \"console-operator-d87b8d5fc-6g6wf\" (UID: \"39964503-ddba-4c2e-9063-e712eb49041b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:39.981156 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.981137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39964503-ddba-4c2e-9063-e712eb49041b-config\") pod \"console-operator-d87b8d5fc-6g6wf\" (UID: \"39964503-ddba-4c2e-9063-e712eb49041b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:39.981408 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.981169 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtj2r\" (UniqueName: \"kubernetes.io/projected/bbff1a65-2590-46a2-b70c-3bc7271945eb-kube-api-access-dtj2r\") pod \"volume-data-source-validator-7d955d5dd4-5crmb\" (UID: \"bbff1a65-2590-46a2-b70c-3bc7271945eb\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-5crmb" Apr 16 18:04:39.981408 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.981195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:39.981408 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.981287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/879e1a6e-7abe-4b70-9fdd-76b30b854006-tmp\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.981884 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.981858 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/879e1a6e-7abe-4b70-9fdd-76b30b854006-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.981962 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.981890 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/879e1a6e-7abe-4b70-9fdd-76b30b854006-snapshots\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.981962 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.981917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:39.982453 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.982438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/879e1a6e-7abe-4b70-9fdd-76b30b854006-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:39.983708 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:39.983660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/879e1a6e-7abe-4b70-9fdd-76b30b854006-serving-cert\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:40.002237 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.002205 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-747zh\" (UniqueName: \"kubernetes.io/projected/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-kube-api-access-747zh\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:40.002527 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.002503 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtj2r\" (UniqueName: \"kubernetes.io/projected/bbff1a65-2590-46a2-b70c-3bc7271945eb-kube-api-access-dtj2r\") pod \"volume-data-source-validator-7d955d5dd4-5crmb\" (UID: \"bbff1a65-2590-46a2-b70c-3bc7271945eb\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-5crmb" Apr 16 18:04:40.002654 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.002637 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwz75\" (UniqueName: \"kubernetes.io/projected/879e1a6e-7abe-4b70-9fdd-76b30b854006-kube-api-access-jwz75\") pod \"insights-operator-5785d4fcdd-lbrtk\" (UID: \"879e1a6e-7abe-4b70-9fdd-76b30b854006\") " pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:40.045215 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.045179 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" Apr 16 18:04:40.082531 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.082492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39964503-ddba-4c2e-9063-e712eb49041b-trusted-ca\") pod \"console-operator-d87b8d5fc-6g6wf\" (UID: \"39964503-ddba-4c2e-9063-e712eb49041b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:40.082724 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.082552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39964503-ddba-4c2e-9063-e712eb49041b-serving-cert\") pod \"console-operator-d87b8d5fc-6g6wf\" (UID: \"39964503-ddba-4c2e-9063-e712eb49041b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:40.082724 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.082588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71948437-c3bf-419e-8170-14db67f520f4-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-zrbs6\" (UID: \"71948437-c3bf-419e-8170-14db67f520f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" Apr 16 18:04:40.082724 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.082610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng5wt\" (UniqueName: \"kubernetes.io/projected/39964503-ddba-4c2e-9063-e712eb49041b-kube-api-access-ng5wt\") pod \"console-operator-d87b8d5fc-6g6wf\" (UID: \"39964503-ddba-4c2e-9063-e712eb49041b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:40.082724 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.082637 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39964503-ddba-4c2e-9063-e712eb49041b-config\") pod \"console-operator-d87b8d5fc-6g6wf\" (UID: \"39964503-ddba-4c2e-9063-e712eb49041b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:40.082724 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.082661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71948437-c3bf-419e-8170-14db67f520f4-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-zrbs6\" (UID: \"71948437-c3bf-419e-8170-14db67f520f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" Apr 16 18:04:40.082724 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.082682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-997vj\" (UniqueName: \"kubernetes.io/projected/71948437-c3bf-419e-8170-14db67f520f4-kube-api-access-997vj\") pod \"kube-storage-version-migrator-operator-756bb7d76f-zrbs6\" (UID: \"71948437-c3bf-419e-8170-14db67f520f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" Apr 16 18:04:40.083334 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.083278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/71948437-c3bf-419e-8170-14db67f520f4-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-zrbs6\" (UID: \"71948437-c3bf-419e-8170-14db67f520f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" Apr 16 18:04:40.083522 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.083503 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39964503-ddba-4c2e-9063-e712eb49041b-trusted-ca\") pod \"console-operator-d87b8d5fc-6g6wf\" (UID: \"39964503-ddba-4c2e-9063-e712eb49041b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:40.084304 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.084274 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39964503-ddba-4c2e-9063-e712eb49041b-config\") pod \"console-operator-d87b8d5fc-6g6wf\" (UID: \"39964503-ddba-4c2e-9063-e712eb49041b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:40.084895 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.084876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71948437-c3bf-419e-8170-14db67f520f4-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-zrbs6\" (UID: \"71948437-c3bf-419e-8170-14db67f520f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" Apr 16 18:04:40.085181 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.085164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39964503-ddba-4c2e-9063-e712eb49041b-serving-cert\") pod \"console-operator-d87b8d5fc-6g6wf\" (UID: \"39964503-ddba-4c2e-9063-e712eb49041b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:40.091917 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.091853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-997vj\" (UniqueName: \"kubernetes.io/projected/71948437-c3bf-419e-8170-14db67f520f4-kube-api-access-997vj\") pod \"kube-storage-version-migrator-operator-756bb7d76f-zrbs6\" (UID: \"71948437-c3bf-419e-8170-14db67f520f4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" Apr 16 18:04:40.092036 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.092013 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng5wt\" (UniqueName: \"kubernetes.io/projected/39964503-ddba-4c2e-9063-e712eb49041b-kube-api-access-ng5wt\") pod \"console-operator-d87b8d5fc-6g6wf\" (UID: \"39964503-ddba-4c2e-9063-e712eb49041b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:40.140728 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.140692 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-5crmb" Apr 16 18:04:40.155545 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.155515 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" Apr 16 18:04:40.185643 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.185586 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9"] Apr 16 18:04:40.189066 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:04:40.189027 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod646dbece_2434_4ec6_ad3e_0f3009cde6a3.slice/crio-09d2862976c0a6e7da16cb9eb72939d7c0e3bf5e5d8e569a3a884c92608b3673 WatchSource:0}: Error finding container 09d2862976c0a6e7da16cb9eb72939d7c0e3bf5e5d8e569a3a884c92608b3673: Status 404 returned error can't find the container with id 09d2862976c0a6e7da16cb9eb72939d7c0e3bf5e5d8e569a3a884c92608b3673 Apr 16 18:04:40.249855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.249824 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" Apr 16 18:04:40.256808 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.256759 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:40.273184 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.273137 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-5crmb"] Apr 16 18:04:40.276467 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:04:40.276436 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbff1a65_2590_46a2_b70c_3bc7271945eb.slice/crio-6f645d4dec21d9c7b78bb05814909fb14ba1f433d51be3647f1ed850e46789de WatchSource:0}: Error finding container 6f645d4dec21d9c7b78bb05814909fb14ba1f433d51be3647f1ed850e46789de: Status 404 returned error can't find the container with id 6f645d4dec21d9c7b78bb05814909fb14ba1f433d51be3647f1ed850e46789de Apr 16 18:04:40.298715 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.298659 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-lbrtk"] Apr 16 18:04:40.303080 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:04:40.303032 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod879e1a6e_7abe_4b70_9fdd_76b30b854006.slice/crio-0a3f0d1d317f063ffe5114195be8da0506df117a967666dffe637ac891eea607 WatchSource:0}: Error finding container 0a3f0d1d317f063ffe5114195be8da0506df117a967666dffe637ac891eea607: Status 404 returned error can't find the container with id 0a3f0d1d317f063ffe5114195be8da0506df117a967666dffe637ac891eea607 Apr 16 18:04:40.410146 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.410112 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6"] Apr 16 18:04:40.413642 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:04:40.413613 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71948437_c3bf_419e_8170_14db67f520f4.slice/crio-fa244079430aec7cf1153f8c4c6ba8f6a5dee06ea754f8a3ee10772314abe5bc WatchSource:0}: Error finding container 
fa244079430aec7cf1153f8c4c6ba8f6a5dee06ea754f8a3ee10772314abe5bc: Status 404 returned error can't find the container with id fa244079430aec7cf1153f8c4c6ba8f6a5dee06ea754f8a3ee10772314abe5bc Apr 16 18:04:40.420256 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.420234 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-6g6wf"] Apr 16 18:04:40.422261 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:04:40.422238 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39964503_ddba_4c2e_9063_e712eb49041b.slice/crio-d80e8e700876e1ea669b82acd3fe141226fbf51f691de5764f6481b524a86d52 WatchSource:0}: Error finding container d80e8e700876e1ea669b82acd3fe141226fbf51f691de5764f6481b524a86d52: Status 404 returned error can't find the container with id d80e8e700876e1ea669b82acd3fe141226fbf51f691de5764f6481b524a86d52 Apr 16 18:04:40.487310 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.487213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:40.487496 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:40.487397 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:40.487496 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:40.487477 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls podName:15702bb5-aa4d-4152-b4a2-faadc3c7fa5f nodeName:}" failed. No retries permitted until 2026-04-16 18:04:41.487457035 +0000 UTC m=+155.600264480 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-cmpk5" (UID: "15702bb5-aa4d-4152-b4a2-faadc3c7fa5f") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:40.909206 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.909135 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" event={"ID":"39964503-ddba-4c2e-9063-e712eb49041b","Type":"ContainerStarted","Data":"d80e8e700876e1ea669b82acd3fe141226fbf51f691de5764f6481b524a86d52"} Apr 16 18:04:40.910927 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.910889 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" event={"ID":"71948437-c3bf-419e-8170-14db67f520f4","Type":"ContainerStarted","Data":"fa244079430aec7cf1153f8c4c6ba8f6a5dee06ea754f8a3ee10772314abe5bc"} Apr 16 18:04:40.912222 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.912177 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" event={"ID":"879e1a6e-7abe-4b70-9fdd-76b30b854006","Type":"ContainerStarted","Data":"0a3f0d1d317f063ffe5114195be8da0506df117a967666dffe637ac891eea607"} Apr 16 18:04:40.916135 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.916095 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-5crmb" event={"ID":"bbff1a65-2590-46a2-b70c-3bc7271945eb","Type":"ContainerStarted","Data":"6f645d4dec21d9c7b78bb05814909fb14ba1f433d51be3647f1ed850e46789de"} Apr 16 18:04:40.918887 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:40.918849 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" event={"ID":"646dbece-2434-4ec6-ad3e-0f3009cde6a3","Type":"ContainerStarted","Data":"09d2862976c0a6e7da16cb9eb72939d7c0e3bf5e5d8e569a3a884c92608b3673"} Apr 16 18:04:41.496382 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:41.496322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:41.496592 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:41.496496 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:41.496592 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:41.496588 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls podName:15702bb5-aa4d-4152-b4a2-faadc3c7fa5f nodeName:}" failed. No retries permitted until 2026-04-16 18:04:43.496559626 +0000 UTC m=+157.609367064 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-cmpk5" (UID: "15702bb5-aa4d-4152-b4a2-faadc3c7fa5f") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:41.786095 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:41.785863 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-6dpmh"] Apr 16 18:04:41.789216 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:41.789156 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-6dpmh" Apr 16 18:04:41.792871 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:41.792645 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-pnc4z\"" Apr 16 18:04:41.795352 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:41.795187 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-6dpmh"] Apr 16 18:04:41.838843 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:41.838774 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" podUID="3e36eb57-6fbb-4969-815e-9ff1adf4c8f4" Apr 16 18:04:41.853036 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:41.852983 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-77v8t" podUID="6429bf79-1554-458a-8ed2-de631c73ca89" Apr 16 18:04:41.868388 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:41.868308 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8rjlz" podUID="176aef22-2713-42bf-81d6-9602a79bf10f" Apr 16 18:04:41.902008 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:41.901800 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vttnw\" (UniqueName: \"kubernetes.io/projected/8e7aafa9-3cde-4bad-8aa1-447548a5edaa-kube-api-access-vttnw\") pod \"network-check-source-7b678d77c7-6dpmh\" (UID: \"8e7aafa9-3cde-4bad-8aa1-447548a5edaa\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-6dpmh" Apr 16 18:04:41.921518 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:41.921486 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:04:41.921518 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:41.921503 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:04:41.922025 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:41.921506 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-77v8t" Apr 16 18:04:42.003589 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:42.003296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vttnw\" (UniqueName: \"kubernetes.io/projected/8e7aafa9-3cde-4bad-8aa1-447548a5edaa-kube-api-access-vttnw\") pod \"network-check-source-7b678d77c7-6dpmh\" (UID: \"8e7aafa9-3cde-4bad-8aa1-447548a5edaa\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-6dpmh" Apr 16 18:04:42.011629 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:42.011601 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vttnw\" (UniqueName: \"kubernetes.io/projected/8e7aafa9-3cde-4bad-8aa1-447548a5edaa-kube-api-access-vttnw\") pod \"network-check-source-7b678d77c7-6dpmh\" (UID: \"8e7aafa9-3cde-4bad-8aa1-447548a5edaa\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-6dpmh" Apr 16 18:04:42.106197 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:42.106161 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-6dpmh" Apr 16 18:04:43.492809 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:43.492766 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-znzwl" podUID="f009e89a-5e15-4d47-81de-24ab98cb437b" Apr 16 18:04:43.518223 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:43.518185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:43.518403 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:43.518343 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:43.518474 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:43.518463 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls podName:15702bb5-aa4d-4152-b4a2-faadc3c7fa5f nodeName:}" failed. No retries permitted until 2026-04-16 18:04:47.51844496 +0000 UTC m=+161.631252412 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-cmpk5" (UID: "15702bb5-aa4d-4152-b4a2-faadc3c7fa5f") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:44.146020 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:44.145895 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-6dpmh"] Apr 16 18:04:44.930386 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:44.930338 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/0.log" Apr 16 18:04:44.930939 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:44.930418 2577 generic.go:358] "Generic (PLEG): container finished" podID="39964503-ddba-4c2e-9063-e712eb49041b" containerID="fbc752aa742178c61ee1ccad66ef11a315820365f3e6281031cf19f6f39b3f7b" exitCode=255 Apr 16 18:04:44.930939 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:44.930490 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" event={"ID":"39964503-ddba-4c2e-9063-e712eb49041b","Type":"ContainerDied","Data":"fbc752aa742178c61ee1ccad66ef11a315820365f3e6281031cf19f6f39b3f7b"} Apr 16 18:04:44.930939 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:44.930922 2577 scope.go:117] "RemoveContainer" containerID="fbc752aa742178c61ee1ccad66ef11a315820365f3e6281031cf19f6f39b3f7b" Apr 16 18:04:44.931988 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:44.931930 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-6dpmh" event={"ID":"8e7aafa9-3cde-4bad-8aa1-447548a5edaa","Type":"ContainerStarted","Data":"b2a79a25211441dadd88395280f8b374f491585292e49711a87cc61bf4e790f9"} Apr 16 18:04:44.931988 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:44.931961 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-6dpmh" event={"ID":"8e7aafa9-3cde-4bad-8aa1-447548a5edaa","Type":"ContainerStarted","Data":"ee0f3a0ce476d9a53c9a3d65f7c299329a3051a1604819136ab9f184015ef4f8"} Apr 16 18:04:44.939301 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:44.939266 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" event={"ID":"71948437-c3bf-419e-8170-14db67f520f4","Type":"ContainerStarted","Data":"baf1fcb4ca5151743b1cb84e1ada675ea681dfd04544ed79c18d7f9b51e92aa8"} Apr 16 18:04:44.941948 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:44.941649 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" event={"ID":"879e1a6e-7abe-4b70-9fdd-76b30b854006","Type":"ContainerStarted","Data":"74176636e1d6e9f4b810112bd3d17c43d3215d7b12b600ab3c0944971a071940"} Apr 16 18:04:44.943151 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:44.943110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-5crmb" event={"ID":"bbff1a65-2590-46a2-b70c-3bc7271945eb","Type":"ContainerStarted","Data":"7de7145ff4a3324aae1a84d6575884c7fd40228352c548b271553c30853adfa6"} Apr 16 18:04:44.944756 ip-10-0-134-133 kubenswrapper[2577]: 
I0416 18:04:44.944725 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" event={"ID":"646dbece-2434-4ec6-ad3e-0f3009cde6a3","Type":"ContainerStarted","Data":"f4c7d9ae014e83c794bd07a0154c869bc6e24149b2b8432e2ddf7d8493b4e1dc"} Apr 16 18:04:44.995109 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:44.994357 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" podStartSLOduration=2.40554662 podStartE2EDuration="5.994339308s" podCreationTimestamp="2026-04-16 18:04:39 +0000 UTC" firstStartedPulling="2026-04-16 18:04:40.419207423 +0000 UTC m=+154.532014859" lastFinishedPulling="2026-04-16 18:04:44.0080001 +0000 UTC m=+158.120807547" observedRunningTime="2026-04-16 18:04:44.993500698 +0000 UTC m=+159.106308155" watchObservedRunningTime="2026-04-16 18:04:44.994339308 +0000 UTC m=+159.107146742" Apr 16 18:04:44.995109 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:44.995053 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" podStartSLOduration=2.300960028 podStartE2EDuration="5.995044765s" podCreationTimestamp="2026-04-16 18:04:39 +0000 UTC" firstStartedPulling="2026-04-16 18:04:40.305699074 +0000 UTC m=+154.418506514" lastFinishedPulling="2026-04-16 18:04:43.999783813 +0000 UTC m=+158.112591251" observedRunningTime="2026-04-16 18:04:44.969854294 +0000 UTC m=+159.082661749" watchObservedRunningTime="2026-04-16 18:04:44.995044765 +0000 UTC m=+159.107852221" Apr 16 18:04:45.010314 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.010260 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-6dpmh" podStartSLOduration=4.010238257 podStartE2EDuration="4.010238257s" podCreationTimestamp="2026-04-16 18:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:45.010234912 +0000 UTC m=+159.123042368" watchObservedRunningTime="2026-04-16 18:04:45.010238257 +0000 UTC m=+159.123045711" Apr 16 18:04:45.035650 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.031667 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" podStartSLOduration=2.225665188 podStartE2EDuration="6.031647368s" podCreationTimestamp="2026-04-16 18:04:39 +0000 UTC" firstStartedPulling="2026-04-16 18:04:40.193149489 +0000 UTC m=+154.305956925" lastFinishedPulling="2026-04-16 18:04:43.99913167 +0000 UTC m=+158.111939105" observedRunningTime="2026-04-16 18:04:45.030048958 +0000 UTC m=+159.142856415" watchObservedRunningTime="2026-04-16 18:04:45.031647368 +0000 UTC m=+159.144454812" Apr 16 18:04:45.694770 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.694680 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-5crmb" podStartSLOduration=2.976254756 podStartE2EDuration="6.694659503s" podCreationTimestamp="2026-04-16 18:04:39 +0000 UTC" firstStartedPulling="2026-04-16 18:04:40.278927043 +0000 UTC m=+154.391734475" lastFinishedPulling="2026-04-16 18:04:43.997331777 +0000 UTC m=+158.110139222" observedRunningTime="2026-04-16 18:04:45.058200188 +0000 UTC m=+159.171007644" 
watchObservedRunningTime="2026-04-16 18:04:45.694659503 +0000 UTC m=+159.807466956" Apr 16 18:04:45.695970 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.695949 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-z9bt2"] Apr 16 18:04:45.699068 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.699052 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-z9bt2" Apr 16 18:04:45.701611 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.701584 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 18:04:45.701744 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.701585 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 18:04:45.701744 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.701596 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-b5g5m\"" Apr 16 18:04:45.706903 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.706874 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-z9bt2"] Apr 16 18:04:45.842790 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.842754 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5r9x\" (UniqueName: \"kubernetes.io/projected/22606e36-9830-4f6a-a7e5-3577a6591eb5-kube-api-access-d5r9x\") pod \"migrator-64d4d94569-z9bt2\" (UID: \"22606e36-9830-4f6a-a7e5-3577a6591eb5\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-z9bt2" Apr 16 18:04:45.943942 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.943902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5r9x\" (UniqueName: \"kubernetes.io/projected/22606e36-9830-4f6a-a7e5-3577a6591eb5-kube-api-access-d5r9x\") pod \"migrator-64d4d94569-z9bt2\" (UID: \"22606e36-9830-4f6a-a7e5-3577a6591eb5\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-z9bt2" Apr 16 18:04:45.949068 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.948997 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/1.log" Apr 16 18:04:45.949354 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.949339 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/0.log" Apr 16 18:04:45.949467 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.949394 2577 generic.go:358] "Generic (PLEG): container finished" podID="39964503-ddba-4c2e-9063-e712eb49041b" containerID="cd46c5180c65802ca628ae14d186e67de2c32d23857d0dc649a0045f4e79af24" exitCode=255 Apr 16 18:04:45.949467 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.949425 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" event={"ID":"39964503-ddba-4c2e-9063-e712eb49041b","Type":"ContainerDied","Data":"cd46c5180c65802ca628ae14d186e67de2c32d23857d0dc649a0045f4e79af24"} Apr 16 18:04:45.949570 
ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.949477 2577 scope.go:117] "RemoveContainer" containerID="fbc752aa742178c61ee1ccad66ef11a315820365f3e6281031cf19f6f39b3f7b" Apr 16 18:04:45.949781 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.949754 2577 scope.go:117] "RemoveContainer" containerID="cd46c5180c65802ca628ae14d186e67de2c32d23857d0dc649a0045f4e79af24" Apr 16 18:04:45.949991 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:45.949966 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-6g6wf_openshift-console-operator(39964503-ddba-4c2e-9063-e712eb49041b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" podUID="39964503-ddba-4c2e-9063-e712eb49041b" Apr 16 18:04:45.951610 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:45.951590 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5r9x\" (UniqueName: \"kubernetes.io/projected/22606e36-9830-4f6a-a7e5-3577a6591eb5-kube-api-access-d5r9x\") pod \"migrator-64d4d94569-z9bt2\" (UID: \"22606e36-9830-4f6a-a7e5-3577a6591eb5\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-z9bt2" Apr 16 18:04:46.009162 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.009125 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-z9bt2" Apr 16 18:04:46.123594 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.123562 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-z9bt2"] Apr 16 18:04:46.126660 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:04:46.126628 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22606e36_9830_4f6a_a7e5_3577a6591eb5.slice/crio-44f9913f75da3735ce532900331e7316e831e024416d7f9d716048dafdf63567 WatchSource:0}: Error finding container 44f9913f75da3735ce532900331e7316e831e024416d7f9d716048dafdf63567: Status 404 returned error can't find the container with id 44f9913f75da3735ce532900331e7316e831e024416d7f9d716048dafdf63567 Apr 16 18:04:46.750272 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.750238 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn"] Apr 16 18:04:46.752041 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.752010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls\") pod \"image-registry-85dc8575d4-v85pb\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:04:46.752182 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:46.752167 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:04:46.752235 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:46.752185 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85dc8575d4-v85pb: secret "image-registry-tls" not found Apr 16 18:04:46.752282 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:46.752245 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls podName:3e36eb57-6fbb-4969-815e-9ff1adf4c8f4 nodeName:}" failed. No retries permitted until 2026-04-16 18:06:48.752227066 +0000 UTC m=+282.865034497 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls") pod "image-registry-85dc8575d4-v85pb" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4") : secret "image-registry-tls" not found Apr 16 18:04:46.753504 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.753484 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:04:46.755605 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.755582 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-g2cnb\"" Apr 16 18:04:46.755768 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.755747 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 18:04:46.756123 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.756104 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 18:04:46.772130 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.772101 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn"] Apr 16 18:04:46.852993 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.852963 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/69a0864a-c403-4e89-a598-d2a7ec22d2fc-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-bsnjn\" (UID: \"69a0864a-c403-4e89-a598-d2a7ec22d2fc\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:04:46.853147 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.852999 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bsnjn\" (UID: \"69a0864a-c403-4e89-a598-d2a7ec22d2fc\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:04:46.853147 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.853030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:04:46.853229 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.853143 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:04:46.853229 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:46.853148 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 16 18:04:46.853315 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:46.853225 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:04:46.853315 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:46.853242 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert podName:176aef22-2713-42bf-81d6-9602a79bf10f nodeName:}" failed. No retries permitted until 2026-04-16 18:06:48.853223788 +0000 UTC m=+282.966031231 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert") pod "ingress-canary-8rjlz" (UID: "176aef22-2713-42bf-81d6-9602a79bf10f") : secret "canary-serving-cert" not found Apr 16 18:04:46.853315 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:46.853293 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls podName:6429bf79-1554-458a-8ed2-de631c73ca89 nodeName:}" failed. No retries permitted until 2026-04-16 18:06:48.853275422 +0000 UTC m=+282.966082864 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls") pod "dns-default-77v8t" (UID: "6429bf79-1554-458a-8ed2-de631c73ca89") : secret "dns-default-metrics-tls" not found Apr 16 18:04:46.953511 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.953479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/69a0864a-c403-4e89-a598-d2a7ec22d2fc-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-bsnjn\" (UID: \"69a0864a-c403-4e89-a598-d2a7ec22d2fc\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:04:46.953989 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.953522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bsnjn\" (UID: \"69a0864a-c403-4e89-a598-d2a7ec22d2fc\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:04:46.953989 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:46.953753 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:04:46.953989 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:46.953830 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert podName:69a0864a-c403-4e89-a598-d2a7ec22d2fc nodeName:}" failed. No retries permitted until 2026-04-16 18:04:47.453811619 +0000 UTC m=+161.566619051 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-bsnjn" (UID: "69a0864a-c403-4e89-a598-d2a7ec22d2fc") : secret "networking-console-plugin-cert" not found Apr 16 18:04:46.954277 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.954234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/69a0864a-c403-4e89-a598-d2a7ec22d2fc-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-bsnjn\" (UID: \"69a0864a-c403-4e89-a598-d2a7ec22d2fc\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:04:46.954637 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.954615 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/1.log" Apr 16 18:04:46.955070 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.955050 2577 scope.go:117] "RemoveContainer" containerID="cd46c5180c65802ca628ae14d186e67de2c32d23857d0dc649a0045f4e79af24" Apr 16 18:04:46.955302 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:46.955256 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-6g6wf_openshift-console-operator(39964503-ddba-4c2e-9063-e712eb49041b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" podUID="39964503-ddba-4c2e-9063-e712eb49041b" Apr 16 18:04:46.955860 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:46.955837 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-z9bt2" event={"ID":"22606e36-9830-4f6a-a7e5-3577a6591eb5","Type":"ContainerStarted","Data":"44f9913f75da3735ce532900331e7316e831e024416d7f9d716048dafdf63567"} Apr 16 18:04:47.409092 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:47.409020 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qrflg_be8a29c6-c9c8-407b-9a79-1120ab614958/dns-node-resolver/0.log" Apr 16 18:04:47.457424 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:47.457389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bsnjn\" (UID: \"69a0864a-c403-4e89-a598-d2a7ec22d2fc\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:04:47.457563 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:47.457530 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:04:47.457603 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:47.457589 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert podName:69a0864a-c403-4e89-a598-d2a7ec22d2fc nodeName:}" failed. No retries permitted until 2026-04-16 18:04:48.457573631 +0000 UTC m=+162.570381068 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-bsnjn" (UID: "69a0864a-c403-4e89-a598-d2a7ec22d2fc") : secret "networking-console-plugin-cert" not found Apr 16 18:04:47.557962 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:47.557929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:47.558097 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:47.558051 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:47.558147 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:47.558105 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls podName:15702bb5-aa4d-4152-b4a2-faadc3c7fa5f nodeName:}" failed. No retries permitted until 2026-04-16 18:04:55.558089285 +0000 UTC m=+169.670896718 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-cmpk5" (UID: "15702bb5-aa4d-4152-b4a2-faadc3c7fa5f") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:47.959869 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:47.959835 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-z9bt2" event={"ID":"22606e36-9830-4f6a-a7e5-3577a6591eb5","Type":"ContainerStarted","Data":"57501924c9beb0d334b2aa9eaad8459677d184cee59f2a4c929681925d669f8d"} Apr 16 18:04:47.959869 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:47.959869 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-z9bt2" event={"ID":"22606e36-9830-4f6a-a7e5-3577a6591eb5","Type":"ContainerStarted","Data":"b4c011f3d5cca8dd7f46dd3bf2288d3726ecea2b885fc8d06b44e40a6ff4f616"} Apr 16 18:04:47.975672 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:47.975629 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-z9bt2" podStartSLOduration=2.014477473 podStartE2EDuration="2.975615756s" podCreationTimestamp="2026-04-16 18:04:45 +0000 UTC" firstStartedPulling="2026-04-16 18:04:46.128409563 +0000 UTC m=+160.241216995" lastFinishedPulling="2026-04-16 18:04:47.089547831 +0000 UTC m=+161.202355278" observedRunningTime="2026-04-16 18:04:47.975168513 +0000 UTC m=+162.087975968" watchObservedRunningTime="2026-04-16 18:04:47.975615756 +0000 UTC m=+162.088423209" Apr 16 18:04:48.222530 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.222452 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-9jw2z"] Apr 16 18:04:48.225445 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.225430 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" Apr 16 18:04:48.227754 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.227725 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 18:04:48.227754 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.227748 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 18:04:48.227940 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.227785 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 18:04:48.227940 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.227795 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 18:04:48.227940 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.227787 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-m9zfz\"" Apr 16 18:04:48.233967 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.233947 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-9jw2z"] Apr 16 18:04:48.364992 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.364959 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5769d5a3-20c8-41e8-9c04-12fedad90621-signing-key\") pod \"service-ca-bfc587fb7-9jw2z\" (UID: \"5769d5a3-20c8-41e8-9c04-12fedad90621\") " pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" Apr 16 18:04:48.364992 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.365008 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml8ks\" (UniqueName: \"kubernetes.io/projected/5769d5a3-20c8-41e8-9c04-12fedad90621-kube-api-access-ml8ks\") pod \"service-ca-bfc587fb7-9jw2z\" (UID: \"5769d5a3-20c8-41e8-9c04-12fedad90621\") " pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" Apr 16 18:04:48.365247 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.365157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5769d5a3-20c8-41e8-9c04-12fedad90621-signing-cabundle\") pod \"service-ca-bfc587fb7-9jw2z\" (UID: \"5769d5a3-20c8-41e8-9c04-12fedad90621\") " pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" Apr 16 18:04:48.408796 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.408766 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-s2zfb_24b24bc6-a399-4980-9de1-8258c56623b3/node-ca/0.log" Apr 16 18:04:48.465591 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.465559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5769d5a3-20c8-41e8-9c04-12fedad90621-signing-key\") pod \"service-ca-bfc587fb7-9jw2z\" (UID: \"5769d5a3-20c8-41e8-9c04-12fedad90621\") " pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" Apr 16 18:04:48.465749 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.465615 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml8ks\" (UniqueName: 
\"kubernetes.io/projected/5769d5a3-20c8-41e8-9c04-12fedad90621-kube-api-access-ml8ks\") pod \"service-ca-bfc587fb7-9jw2z\" (UID: \"5769d5a3-20c8-41e8-9c04-12fedad90621\") " pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" Apr 16 18:04:48.465749 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.465717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5769d5a3-20c8-41e8-9c04-12fedad90621-signing-cabundle\") pod \"service-ca-bfc587fb7-9jw2z\" (UID: \"5769d5a3-20c8-41e8-9c04-12fedad90621\") " pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" Apr 16 18:04:48.465874 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.465745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bsnjn\" (UID: \"69a0864a-c403-4e89-a598-d2a7ec22d2fc\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:04:48.465928 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:48.465872 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:04:48.465976 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:48.465952 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert podName:69a0864a-c403-4e89-a598-d2a7ec22d2fc nodeName:}" failed. No retries permitted until 2026-04-16 18:04:50.465932224 +0000 UTC m=+164.578739656 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-bsnjn" (UID: "69a0864a-c403-4e89-a598-d2a7ec22d2fc") : secret "networking-console-plugin-cert" not found Apr 16 18:04:48.466387 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.466352 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5769d5a3-20c8-41e8-9c04-12fedad90621-signing-cabundle\") pod \"service-ca-bfc587fb7-9jw2z\" (UID: \"5769d5a3-20c8-41e8-9c04-12fedad90621\") " pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" Apr 16 18:04:48.468037 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.468021 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5769d5a3-20c8-41e8-9c04-12fedad90621-signing-key\") pod \"service-ca-bfc587fb7-9jw2z\" (UID: \"5769d5a3-20c8-41e8-9c04-12fedad90621\") " pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" Apr 16 18:04:48.474102 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.474051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml8ks\" (UniqueName: \"kubernetes.io/projected/5769d5a3-20c8-41e8-9c04-12fedad90621-kube-api-access-ml8ks\") pod \"service-ca-bfc587fb7-9jw2z\" (UID: \"5769d5a3-20c8-41e8-9c04-12fedad90621\") " pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" Apr 16 18:04:48.534782 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.534744 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" Apr 16 18:04:48.663553 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.663519 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-9jw2z"] Apr 16 18:04:48.666208 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:04:48.666178 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5769d5a3_20c8_41e8_9c04_12fedad90621.slice/crio-0ac9847be87447d1f253d6099f18f8aac728e90bcb07202bc1d9847d7aa456b5 WatchSource:0}: Error finding container 0ac9847be87447d1f253d6099f18f8aac728e90bcb07202bc1d9847d7aa456b5: Status 404 returned error can't find the container with id 0ac9847be87447d1f253d6099f18f8aac728e90bcb07202bc1d9847d7aa456b5 Apr 16 18:04:48.964294 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.964258 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" event={"ID":"5769d5a3-20c8-41e8-9c04-12fedad90621","Type":"ContainerStarted","Data":"add34cda2d0608e1a91c60c8cd7bd058df50a9935082318dcafc292e7fbb21a0"} Apr 16 18:04:48.964294 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.964294 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" event={"ID":"5769d5a3-20c8-41e8-9c04-12fedad90621","Type":"ContainerStarted","Data":"0ac9847be87447d1f253d6099f18f8aac728e90bcb07202bc1d9847d7aa456b5"} Apr 16 18:04:48.980386 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:48.980317 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-9jw2z" podStartSLOduration=0.980302978 podStartE2EDuration="980.302978ms" podCreationTimestamp="2026-04-16 18:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:48.979793113 +0000 UTC m=+163.092600566" watchObservedRunningTime="2026-04-16 18:04:48.980302978 +0000 UTC m=+163.093110432" Apr 16 18:04:50.257946 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:50.257857 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:50.257946 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:50.257901 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:04:50.258466 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:50.258298 2577 scope.go:117] "RemoveContainer" containerID="cd46c5180c65802ca628ae14d186e67de2c32d23857d0dc649a0045f4e79af24" Apr 16 18:04:50.258524 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:50.258478 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-6g6wf_openshift-console-operator(39964503-ddba-4c2e-9063-e712eb49041b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" podUID="39964503-ddba-4c2e-9063-e712eb49041b" Apr 16 18:04:50.483114 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:50.483061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bsnjn\" (UID: \"69a0864a-c403-4e89-a598-d2a7ec22d2fc\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:04:50.483269 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:50.483186 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:04:50.483269 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:50.483243 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert podName:69a0864a-c403-4e89-a598-d2a7ec22d2fc nodeName:}" failed. No retries permitted until 2026-04-16 18:04:54.4832287 +0000 UTC m=+168.596036131 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-bsnjn" (UID: "69a0864a-c403-4e89-a598-d2a7ec22d2fc") : secret "networking-console-plugin-cert" not found Apr 16 18:04:54.516009 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:54.515966 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bsnjn\" (UID: \"69a0864a-c403-4e89-a598-d2a7ec22d2fc\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:04:54.516411 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:54.516143 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:04:54.516411 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:54.516228 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert podName:69a0864a-c403-4e89-a598-d2a7ec22d2fc nodeName:}" failed. No retries permitted until 2026-04-16 18:05:02.516207267 +0000 UTC m=+176.629014702 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-bsnjn" (UID: "69a0864a-c403-4e89-a598-d2a7ec22d2fc") : secret "networking-console-plugin-cert" not found Apr 16 18:04:55.626098 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:55.626058 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:04:55.626593 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:55.626191 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:55.626593 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:04:55.626331 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls podName:15702bb5-aa4d-4152-b4a2-faadc3c7fa5f nodeName:}" failed. No retries permitted until 2026-04-16 18:05:11.626307336 +0000 UTC m=+185.739114774 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-cmpk5" (UID: "15702bb5-aa4d-4152-b4a2-faadc3c7fa5f") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:04:58.473119 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:04:58.473074 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:05:02.586253 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:02.586215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bsnjn\" (UID: \"69a0864a-c403-4e89-a598-d2a7ec22d2fc\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:05:02.588542 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:02.588523 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/69a0864a-c403-4e89-a598-d2a7ec22d2fc-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bsnjn\" (UID: \"69a0864a-c403-4e89-a598-d2a7ec22d2fc\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:05:02.663991 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:02.663964 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" Apr 16 18:05:02.777220 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:02.777186 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn"] Apr 16 18:05:02.780577 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:05:02.780549 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a0864a_c403_4e89_a598_d2a7ec22d2fc.slice/crio-a1454fed8a88cbae6a05b6f0183bf66e9e5ee9e52fed9a9f7bbe64ed63e921d9 WatchSource:0}: Error finding container a1454fed8a88cbae6a05b6f0183bf66e9e5ee9e52fed9a9f7bbe64ed63e921d9: Status 404 returned error can't find the container with id a1454fed8a88cbae6a05b6f0183bf66e9e5ee9e52fed9a9f7bbe64ed63e921d9 Apr 16 18:05:03.008317 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:03.008230 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" event={"ID":"69a0864a-c403-4e89-a598-d2a7ec22d2fc","Type":"ContainerStarted","Data":"a1454fed8a88cbae6a05b6f0183bf66e9e5ee9e52fed9a9f7bbe64ed63e921d9"} Apr 16 18:05:04.011842 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:04.011799 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" event={"ID":"69a0864a-c403-4e89-a598-d2a7ec22d2fc","Type":"ContainerStarted","Data":"661bf51aa57f22c90ee9a4e46a53f41d550d9879f5845e96e2d461706892080a"} Apr 16 18:05:04.027293 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:04.027247 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bsnjn" podStartSLOduration=16.973479175 podStartE2EDuration="18.027233742s" podCreationTimestamp="2026-04-16 18:04:46 +0000 UTC" firstStartedPulling="2026-04-16 18:05:02.782410207 +0000 UTC m=+176.895217639" lastFinishedPulling="2026-04-16 18:05:03.836164774 +0000 UTC m=+177.948972206" observedRunningTime="2026-04-16 18:05:04.02633566 +0000 UTC m=+178.139143114" watchObservedRunningTime="2026-04-16 18:05:04.027233742 +0000 UTC m=+178.140041196" Apr 16 18:05:05.473149 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:05.473117 2577 scope.go:117] "RemoveContainer" containerID="cd46c5180c65802ca628ae14d186e67de2c32d23857d0dc649a0045f4e79af24" Apr 16 18:05:06.017947 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:06.017922 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:05:06.018300 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:06.018283 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/1.log" Apr 16 18:05:06.018349 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:06.018319 2577 generic.go:358] "Generic (PLEG): container finished" podID="39964503-ddba-4c2e-9063-e712eb49041b" containerID="ff93c972cd2b6444c1cacb7b846da1f7b090c0dd01d2239bbe9201de30a0b370" exitCode=255 Apr 16 18:05:06.018416 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:06.018393 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" 
event={"ID":"39964503-ddba-4c2e-9063-e712eb49041b","Type":"ContainerDied","Data":"ff93c972cd2b6444c1cacb7b846da1f7b090c0dd01d2239bbe9201de30a0b370"} Apr 16 18:05:06.018472 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:06.018438 2577 scope.go:117] "RemoveContainer" containerID="cd46c5180c65802ca628ae14d186e67de2c32d23857d0dc649a0045f4e79af24" Apr 16 18:05:06.018782 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:06.018764 2577 scope.go:117] "RemoveContainer" containerID="ff93c972cd2b6444c1cacb7b846da1f7b090c0dd01d2239bbe9201de30a0b370" Apr 16 18:05:06.018986 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:05:06.018966 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-6g6wf_openshift-console-operator(39964503-ddba-4c2e-9063-e712eb49041b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" podUID="39964503-ddba-4c2e-9063-e712eb49041b" Apr 16 18:05:07.022709 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:07.022678 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:05:10.257094 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.257063 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:05:10.257094 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.257095 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:05:10.257523 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.257467 2577 scope.go:117] "RemoveContainer" containerID="ff93c972cd2b6444c1cacb7b846da1f7b090c0dd01d2239bbe9201de30a0b370" Apr 16 18:05:10.257641 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:05:10.257623 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-6g6wf_openshift-console-operator(39964503-ddba-4c2e-9063-e712eb49041b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" podUID="39964503-ddba-4c2e-9063-e712eb49041b" Apr 16 18:05:10.293533 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.293498 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-sl5k7"] Apr 16 18:05:10.298628 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.298607 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.300938 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.300919 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:05:10.301571 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.301558 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:05:10.305547 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.305528 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-92dpg\"" Apr 16 18:05:10.318458 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.318434 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sl5k7"] Apr 16 18:05:10.353540 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.353508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4c10821a-0056-41c9-86ef-a22224af6e3c-crio-socket\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.353540 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.353540 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxbq2\" (UniqueName: \"kubernetes.io/projected/4c10821a-0056-41c9-86ef-a22224af6e3c-kube-api-access-bxbq2\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.353692 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.353569 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4c10821a-0056-41c9-86ef-a22224af6e3c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.353692 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.353623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4c10821a-0056-41c9-86ef-a22224af6e3c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.353814 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.353794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4c10821a-0056-41c9-86ef-a22224af6e3c-data-volume\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.454828 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.454798 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4c10821a-0056-41c9-86ef-a22224af6e3c-data-volume\") pod \"insights-runtime-extractor-sl5k7\" (UID: 
\"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.455011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.454859 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4c10821a-0056-41c9-86ef-a22224af6e3c-crio-socket\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.455011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.454879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxbq2\" (UniqueName: \"kubernetes.io/projected/4c10821a-0056-41c9-86ef-a22224af6e3c-kube-api-access-bxbq2\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.455011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.454936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4c10821a-0056-41c9-86ef-a22224af6e3c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.455011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.454965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4c10821a-0056-41c9-86ef-a22224af6e3c-crio-socket\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.455011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.454973 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4c10821a-0056-41c9-86ef-a22224af6e3c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.455359 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.455339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4c10821a-0056-41c9-86ef-a22224af6e3c-data-volume\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.455574 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.455555 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4c10821a-0056-41c9-86ef-a22224af6e3c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.457236 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.457216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4c10821a-0056-41c9-86ef-a22224af6e3c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 
18:05:10.463698 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.463650 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxbq2\" (UniqueName: \"kubernetes.io/projected/4c10821a-0056-41c9-86ef-a22224af6e3c-kube-api-access-bxbq2\") pod \"insights-runtime-extractor-sl5k7\" (UID: \"4c10821a-0056-41c9-86ef-a22224af6e3c\") " pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.607611 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.607583 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sl5k7" Apr 16 18:05:10.741994 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:10.741950 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sl5k7"] Apr 16 18:05:10.746276 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:05:10.746248 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c10821a_0056_41c9_86ef_a22224af6e3c.slice/crio-018fdd5830f92335986cf05516239989074da1032f8a471ef4838dc44fb35f23 WatchSource:0}: Error finding container 018fdd5830f92335986cf05516239989074da1032f8a471ef4838dc44fb35f23: Status 404 returned error can't find the container with id 018fdd5830f92335986cf05516239989074da1032f8a471ef4838dc44fb35f23 Apr 16 18:05:11.033707 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:11.033624 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sl5k7" event={"ID":"4c10821a-0056-41c9-86ef-a22224af6e3c","Type":"ContainerStarted","Data":"8f7ae63b7edc62064413d0c777ee72cfa03069c62217e833e7dceb016f902ece"} Apr 16 18:05:11.033707 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:11.033665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sl5k7" event={"ID":"4c10821a-0056-41c9-86ef-a22224af6e3c","Type":"ContainerStarted","Data":"018fdd5830f92335986cf05516239989074da1032f8a471ef4838dc44fb35f23"} Apr 16 18:05:11.664237 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:11.664167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:05:11.666273 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:11.666255 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/15702bb5-aa4d-4152-b4a2-faadc3c7fa5f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-cmpk5\" (UID: \"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:05:11.962314 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:11.962230 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4jhmt\"" Apr 16 18:05:11.970999 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:11.970975 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" Apr 16 18:05:12.039020 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:12.038977 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sl5k7" event={"ID":"4c10821a-0056-41c9-86ef-a22224af6e3c","Type":"ContainerStarted","Data":"c6c1b26268292b9300c2ece10fb0b8eb1806921f40db0c335fe8c9c052e5af9e"} Apr 16 18:05:12.104875 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:12.104843 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5"] Apr 16 18:05:12.107949 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:05:12.107909 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15702bb5_aa4d_4152_b4a2_faadc3c7fa5f.slice/crio-35d963c5929a7138e6e0282dbf78db8fc0fc34ad3d619b0a0305da801dc4ecc7 WatchSource:0}: Error finding container 35d963c5929a7138e6e0282dbf78db8fc0fc34ad3d619b0a0305da801dc4ecc7: Status 404 returned error can't find the container with id 35d963c5929a7138e6e0282dbf78db8fc0fc34ad3d619b0a0305da801dc4ecc7 Apr 16 18:05:13.043722 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:13.043628 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sl5k7" event={"ID":"4c10821a-0056-41c9-86ef-a22224af6e3c","Type":"ContainerStarted","Data":"a798e3bb7440719d97c2f62814e82164d034947db093856f70ff8600dfeb77c7"} Apr 16 18:05:13.044666 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:13.044646 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" event={"ID":"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f","Type":"ContainerStarted","Data":"35d963c5929a7138e6e0282dbf78db8fc0fc34ad3d619b0a0305da801dc4ecc7"} Apr 16 18:05:13.061728 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:13.061666 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-sl5k7" podStartSLOduration=1.177282766 podStartE2EDuration="3.061646532s" podCreationTimestamp="2026-04-16 18:05:10 +0000 UTC" firstStartedPulling="2026-04-16 18:05:10.803734749 +0000 UTC m=+184.916542182" lastFinishedPulling="2026-04-16 18:05:12.688098515 +0000 UTC m=+186.800905948" observedRunningTime="2026-04-16 18:05:13.060626695 +0000 UTC m=+187.173434146" watchObservedRunningTime="2026-04-16 18:05:13.061646532 +0000 UTC m=+187.174453988" Apr 16 18:05:14.689467 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:14.689432 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww"] Apr 16 18:05:14.692546 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:14.692520 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww" Apr 16 18:05:14.696481 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:14.696451 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-8g854\"" Apr 16 18:05:14.696895 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:14.696878 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 18:05:14.707837 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:14.706423 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww"] Apr 16 18:05:14.793948 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:14.793908 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5b420faf-ed52-4f86-9d02-a3f48f948b9e-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-tcjww\" (UID: \"5b420faf-ed52-4f86-9d02-a3f48f948b9e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww" Apr 16 18:05:14.894609 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:14.894566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5b420faf-ed52-4f86-9d02-a3f48f948b9e-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-tcjww\" (UID: \"5b420faf-ed52-4f86-9d02-a3f48f948b9e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww" Apr 16 18:05:14.894792 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:05:14.894719 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 18:05:14.894792 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:05:14.894788 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b420faf-ed52-4f86-9d02-a3f48f948b9e-tls-certificates podName:5b420faf-ed52-4f86-9d02-a3f48f948b9e nodeName:}" failed. No retries permitted until 2026-04-16 18:05:15.394771659 +0000 UTC m=+189.507579091 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/5b420faf-ed52-4f86-9d02-a3f48f948b9e-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-tcjww" (UID: "5b420faf-ed52-4f86-9d02-a3f48f948b9e") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 18:05:15.051785 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:15.051696 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" event={"ID":"15702bb5-aa4d-4152-b4a2-faadc3c7fa5f","Type":"ContainerStarted","Data":"51da360f8356758590c6d2603d57d94ce4e1f2cb96dd4540513440a0efbdbbaf"} Apr 16 18:05:15.398971 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:15.398929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5b420faf-ed52-4f86-9d02-a3f48f948b9e-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-tcjww\" (UID: \"5b420faf-ed52-4f86-9d02-a3f48f948b9e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww" Apr 16 18:05:15.401317 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:15.401289 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5b420faf-ed52-4f86-9d02-a3f48f948b9e-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-tcjww\" (UID: \"5b420faf-ed52-4f86-9d02-a3f48f948b9e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww" Apr 16 18:05:15.601742 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:15.601703 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww" Apr 16 18:05:15.720519 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:15.720467 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-cmpk5" podStartSLOduration=34.739685905 podStartE2EDuration="36.720447709s" podCreationTimestamp="2026-04-16 18:04:39 +0000 UTC" firstStartedPulling="2026-04-16 18:05:12.110320317 +0000 UTC m=+186.223127763" lastFinishedPulling="2026-04-16 18:05:14.091082136 +0000 UTC m=+188.203889567" observedRunningTime="2026-04-16 18:05:15.075709647 +0000 UTC m=+189.188517101" watchObservedRunningTime="2026-04-16 18:05:15.720447709 +0000 UTC m=+189.833255176" Apr 16 18:05:15.721529 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:15.721504 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww"] Apr 16 18:05:15.725198 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:05:15.725171 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b420faf_ed52_4f86_9d02_a3f48f948b9e.slice/crio-371bfed699075a19e69873bd0b88876c7c6897f6efeec53ce63c2627ff596545 WatchSource:0}: Error finding container 371bfed699075a19e69873bd0b88876c7c6897f6efeec53ce63c2627ff596545: Status 404 returned error can't find the container with id 371bfed699075a19e69873bd0b88876c7c6897f6efeec53ce63c2627ff596545 Apr 16 18:05:16.055115 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:16.055032 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww" 
event={"ID":"5b420faf-ed52-4f86-9d02-a3f48f948b9e","Type":"ContainerStarted","Data":"371bfed699075a19e69873bd0b88876c7c6897f6efeec53ce63c2627ff596545"} Apr 16 18:05:17.061235 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:17.061197 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww" event={"ID":"5b420faf-ed52-4f86-9d02-a3f48f948b9e","Type":"ContainerStarted","Data":"efda7ab0bc8d9a3b0788dde74f082efb6f15c93ca8241c7e54d47df00875f03e"} Apr 16 18:05:17.061714 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:17.061398 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww" Apr 16 18:05:17.066201 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:17.066179 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww" Apr 16 18:05:17.079170 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:17.079126 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-tcjww" podStartSLOduration=1.946755613 podStartE2EDuration="3.07910183s" podCreationTimestamp="2026-04-16 18:05:14 +0000 UTC" firstStartedPulling="2026-04-16 18:05:15.726966484 +0000 UTC m=+189.839773919" lastFinishedPulling="2026-04-16 18:05:16.859312698 +0000 UTC m=+190.972120136" observedRunningTime="2026-04-16 18:05:17.078951507 +0000 UTC m=+191.191758962" watchObservedRunningTime="2026-04-16 18:05:17.07910183 +0000 UTC m=+191.191909285" Apr 16 18:05:21.473733 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:21.473694 2577 scope.go:117] "RemoveContainer" containerID="ff93c972cd2b6444c1cacb7b846da1f7b090c0dd01d2239bbe9201de30a0b370" Apr 16 18:05:21.474193 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:05:21.473943 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-6g6wf_openshift-console-operator(39964503-ddba-4c2e-9063-e712eb49041b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" podUID="39964503-ddba-4c2e-9063-e712eb49041b" Apr 16 18:05:22.266972 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.266939 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd"] Apr 16 18:05:22.270908 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.270876 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.274426 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.274394 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:05:22.274588 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.274565 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-cgrbf\"" Apr 16 18:05:22.274680 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.274588 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:05:22.274680 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.274588 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 18:05:22.290892 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.290847 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd"] Apr 16 18:05:22.307730 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.307691 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7l6ll"] Apr 16 18:05:22.311583 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.311554 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.313608 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.313586 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:05:22.314358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.314332 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7x7gd\"" Apr 16 18:05:22.314482 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.314407 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:05:22.314872 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.314852 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:05:22.324181 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.324153 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-g5cj2"] Apr 16 18:05:22.327399 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.327359 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.329430 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.329406 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-s5wd5\"" Apr 16 18:05:22.329995 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.329977 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 18:05:22.331281 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.331261 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 18:05:22.331406 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.331388 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:05:22.343567 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.343538 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-g5cj2"] Apr 16 18:05:22.361436 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-tls\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.361436 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361440 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/070020cc-e67b-4965-a3df-3cf85fed6a85-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.361672 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45727ce7-e2c5-48b0-b000-4e32997b56df-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-kl8xd\" (UID: \"45727ce7-e2c5-48b0-b000-4e32997b56df\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.361672 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361521 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-accelerators-collector-config\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.361672 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361555 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69c059ad-8285-4260-ab9e-9163abfdcada-sys\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.361672 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361591 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/070020cc-e67b-4965-a3df-3cf85fed6a85-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.361672 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361614 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.361672 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-wtmp\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.361890 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361708 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2zc7\" (UniqueName: \"kubernetes.io/projected/45727ce7-e2c5-48b0-b000-4e32997b56df-kube-api-access-t2zc7\") pod \"openshift-state-metrics-5669946b84-kl8xd\" (UID: \"45727ce7-e2c5-48b0-b000-4e32997b56df\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.361890 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361735 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45727ce7-e2c5-48b0-b000-4e32997b56df-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-kl8xd\" (UID: \"45727ce7-e2c5-48b0-b000-4e32997b56df\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.361890 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361755 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.361890 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361777 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/69c059ad-8285-4260-ab9e-9163abfdcada-root\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.361890 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45727ce7-e2c5-48b0-b000-4e32997b56df-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-5669946b84-kl8xd\" (UID: \"45727ce7-e2c5-48b0-b000-4e32997b56df\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.361890 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361829 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5twzp\" (UniqueName: \"kubernetes.io/projected/69c059ad-8285-4260-ab9e-9163abfdcada-kube-api-access-5twzp\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.361890 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361872 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69c059ad-8285-4260-ab9e-9163abfdcada-metrics-client-ca\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.362108 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7m29\" (UniqueName: \"kubernetes.io/projected/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-api-access-q7m29\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.362108 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361976 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.362108 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.361999 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.362108 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.362018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-textfile\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.462507 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462463 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45727ce7-e2c5-48b0-b000-4e32997b56df-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-kl8xd\" (UID: \"45727ce7-e2c5-48b0-b000-4e32997b56df\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.462682 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.462682 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/69c059ad-8285-4260-ab9e-9163abfdcada-root\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.462682 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462583 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45727ce7-e2c5-48b0-b000-4e32997b56df-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-kl8xd\" (UID: \"45727ce7-e2c5-48b0-b000-4e32997b56df\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.462682 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5twzp\" (UniqueName: \"kubernetes.io/projected/69c059ad-8285-4260-ab9e-9163abfdcada-kube-api-access-5twzp\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.462682 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69c059ad-8285-4260-ab9e-9163abfdcada-metrics-client-ca\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.462682 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462654 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/69c059ad-8285-4260-ab9e-9163abfdcada-root\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.462984 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7m29\" (UniqueName: \"kubernetes.io/projected/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-api-access-q7m29\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.462984 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462746 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.462984 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-state-metrics-tls\") pod 
\"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.462984 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462819 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-textfile\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.462984 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-tls\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.462984 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462871 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/070020cc-e67b-4965-a3df-3cf85fed6a85-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.462984 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45727ce7-e2c5-48b0-b000-4e32997b56df-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-kl8xd\" (UID: \"45727ce7-e2c5-48b0-b000-4e32997b56df\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.462984 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-accelerators-collector-config\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.462984 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.462970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69c059ad-8285-4260-ab9e-9163abfdcada-sys\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.463556 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.463024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/070020cc-e67b-4965-a3df-3cf85fed6a85-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.463556 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:05:22.463081 2577 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 18:05:22.463556 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:05:22.463166 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-state-metrics-tls podName:070020cc-e67b-4965-a3df-3cf85fed6a85 nodeName:}" failed. No retries permitted until 2026-04-16 18:05:22.963146096 +0000 UTC m=+197.075953554 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-g5cj2" (UID: "070020cc-e67b-4965-a3df-3cf85fed6a85") : secret "kube-state-metrics-tls" not found Apr 16 18:05:22.463556 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.463288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69c059ad-8285-4260-ab9e-9163abfdcada-metrics-client-ca\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.463556 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.463403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/070020cc-e67b-4965-a3df-3cf85fed6a85-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.463556 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.463460 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69c059ad-8285-4260-ab9e-9163abfdcada-sys\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.463944 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.463895 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-accelerators-collector-config\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.463944 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.463925 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/070020cc-e67b-4965-a3df-3cf85fed6a85-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.464055 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.463989 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45727ce7-e2c5-48b0-b000-4e32997b56df-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-kl8xd\" (UID: \"45727ce7-e2c5-48b0-b000-4e32997b56df\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.464055 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.463993 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.464055 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.464046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-wtmp\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.464220 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.464090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2zc7\" (UniqueName: \"kubernetes.io/projected/45727ce7-e2c5-48b0-b000-4e32997b56df-kube-api-access-t2zc7\") pod \"openshift-state-metrics-5669946b84-kl8xd\" (UID: \"45727ce7-e2c5-48b0-b000-4e32997b56df\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.464220 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.464169 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-textfile\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.464330 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.464304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-wtmp\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.464602 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.464578 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.465704 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.465679 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-tls\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.465822 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.465735 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.466309 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.466285 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69c059ad-8285-4260-ab9e-9163abfdcada-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 
18:05:22.466485 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.466468 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45727ce7-e2c5-48b0-b000-4e32997b56df-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-kl8xd\" (UID: \"45727ce7-e2c5-48b0-b000-4e32997b56df\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.466640 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.466621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45727ce7-e2c5-48b0-b000-4e32997b56df-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-kl8xd\" (UID: \"45727ce7-e2c5-48b0-b000-4e32997b56df\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.486868 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.486834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7m29\" (UniqueName: \"kubernetes.io/projected/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-api-access-q7m29\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.490633 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.490603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5twzp\" (UniqueName: \"kubernetes.io/projected/69c059ad-8285-4260-ab9e-9163abfdcada-kube-api-access-5twzp\") pod \"node-exporter-7l6ll\" (UID: \"69c059ad-8285-4260-ab9e-9163abfdcada\") " pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.490795 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.490762 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2zc7\" (UniqueName: \"kubernetes.io/projected/45727ce7-e2c5-48b0-b000-4e32997b56df-kube-api-access-t2zc7\") pod \"openshift-state-metrics-5669946b84-kl8xd\" (UID: \"45727ce7-e2c5-48b0-b000-4e32997b56df\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.581717 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.581676 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" Apr 16 18:05:22.623804 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.623575 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7l6ll" Apr 16 18:05:22.727033 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.726997 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd"] Apr 16 18:05:22.730036 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:05:22.730007 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45727ce7_e2c5_48b0_b000_4e32997b56df.slice/crio-5ff37b03d2f773e4050ff93f5d9d4c831e5497d50c20a87f213f0d4da2be39b8 WatchSource:0}: Error finding container 5ff37b03d2f773e4050ff93f5d9d4c831e5497d50c20a87f213f0d4da2be39b8: Status 404 returned error can't find the container with id 5ff37b03d2f773e4050ff93f5d9d4c831e5497d50c20a87f213f0d4da2be39b8 Apr 16 18:05:22.969237 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.969201 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:22.971745 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:22.971719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/070020cc-e67b-4965-a3df-3cf85fed6a85-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-g5cj2\" (UID: \"070020cc-e67b-4965-a3df-3cf85fed6a85\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:23.078522 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.078481 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7l6ll" event={"ID":"69c059ad-8285-4260-ab9e-9163abfdcada","Type":"ContainerStarted","Data":"6a2a36c3357061fac214e049258bf95a0253b2c6110fad38501a0ba3e0aa55f7"} Apr 16 18:05:23.080332 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.080295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" event={"ID":"45727ce7-e2c5-48b0-b000-4e32997b56df","Type":"ContainerStarted","Data":"51876a8fae22083857ae98ed8f32a6ac6a57b0cf737ccac77e99ea5c20dcaa5e"} Apr 16 18:05:23.080498 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.080337 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" event={"ID":"45727ce7-e2c5-48b0-b000-4e32997b56df","Type":"ContainerStarted","Data":"5c121839142e95549ffe006cfa28efd4faa949e62f27148d80ef62a118c94d88"} Apr 16 18:05:23.080498 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.080351 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" event={"ID":"45727ce7-e2c5-48b0-b000-4e32997b56df","Type":"ContainerStarted","Data":"5ff37b03d2f773e4050ff93f5d9d4c831e5497d50c20a87f213f0d4da2be39b8"} Apr 16 18:05:23.237774 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.237670 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" Apr 16 18:05:23.326453 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.326419 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:05:23.330074 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.330053 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.332041 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.332014 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:05:23.332815 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.332647 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-n8zmp\"" Apr 16 18:05:23.332815 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.332667 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:05:23.332815 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.332681 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:05:23.333025 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.332822 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:05:23.333025 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.332840 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:05:23.333125 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.333110 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:05:23.333177 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.333157 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:05:23.333429 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.333408 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:05:23.333529 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.333446 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:05:23.346080 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.346050 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:05:23.373746 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.373703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.373901 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.373765 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.373901 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.373859 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-config-volume\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.373972 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.373903 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.374006 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.373980 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c80e072-3897-4516-ba1c-74201dec58e9-config-out\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.374039 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.374007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.374080 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.374041 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.374113 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.374102 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.374148 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.374129 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.374180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.374163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.374213 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.374195 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c80e072-3897-4516-ba1c-74201dec58e9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.374358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.374221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-web-config\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.374358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.374251 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d6vr\" (UniqueName: \"kubernetes.io/projected/9c80e072-3897-4516-ba1c-74201dec58e9-kube-api-access-2d6vr\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.473268 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.473104 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-g5cj2"] Apr 16 18:05:23.475250 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.475222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.475386 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.475293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.475386 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.475325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.475912 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:05:23.475647 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-trusted-ca-bundle podName:9c80e072-3897-4516-ba1c-74201dec58e9 nodeName:}" failed. No retries permitted until 2026-04-16 18:05:23.975622789 +0000 UTC m=+198.088430239 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9") : configmap references non-existent config key: ca-bundle.crt Apr 16 18:05:23.475912 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.475360 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.476082 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.475954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c80e072-3897-4516-ba1c-74201dec58e9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.476082 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.475983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-web-config\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.476082 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.476015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2d6vr\" (UniqueName: \"kubernetes.io/projected/9c80e072-3897-4516-ba1c-74201dec58e9-kube-api-access-2d6vr\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.476239 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.476090 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.476239 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.476095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.476239 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.476145 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.476239 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.476172 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.476239 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.476202 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.476506 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.476282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c80e072-3897-4516-ba1c-74201dec58e9-config-out\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.476506 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.476307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.476506 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:05:23.476493 2577 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 18:05:23.476659 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:05:23.476544 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-main-tls podName:9c80e072-3897-4516-ba1c-74201dec58e9 nodeName:}" failed. No retries permitted until 2026-04-16 18:05:23.976527911 +0000 UTC m=+198.089335344 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9") : secret "alertmanager-main-tls" not found Apr 16 18:05:23.477744 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.477316 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.480341 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.480291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.482164 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.482138 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c80e072-3897-4516-ba1c-74201dec58e9-config-out\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.484350 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.484013 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.484812 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.484660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-web-config\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.484812 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.484711 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.484960 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.484823 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.486975 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.486955 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c80e072-3897-4516-ba1c-74201dec58e9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.487472 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.487335 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-config-volume\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.487600 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.487584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d6vr\" (UniqueName: \"kubernetes.io/projected/9c80e072-3897-4516-ba1c-74201dec58e9-kube-api-access-2d6vr\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.983111 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.983061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.983261 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.983123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.983927 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.983898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:23.985532 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:23.985512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:24.085850 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:24.085810 2577 generic.go:358] "Generic (PLEG): container finished" podID="69c059ad-8285-4260-ab9e-9163abfdcada" containerID="0eb78745cc10352ef1ec1db564be1714261c6be7cc372f0571fe01355c4b6e25" exitCode=0 Apr 16 18:05:24.086242 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:24.085900 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7l6ll" event={"ID":"69c059ad-8285-4260-ab9e-9163abfdcada","Type":"ContainerDied","Data":"0eb78745cc10352ef1ec1db564be1714261c6be7cc372f0571fe01355c4b6e25"} Apr 16 18:05:24.088149 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:24.088119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" event={"ID":"45727ce7-e2c5-48b0-b000-4e32997b56df","Type":"ContainerStarted","Data":"4d97c5f66341c7e3e188a6994f8830e13d011dedfec0097cf99c36f7a6b696eb"} Apr 16 18:05:24.089699 ip-10-0-134-133 
kubenswrapper[2577]: I0416 18:05:24.089677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" event={"ID":"070020cc-e67b-4965-a3df-3cf85fed6a85","Type":"ContainerStarted","Data":"7fe0121dbadd26dc7b0416317b720b5fd77918971409c64d25715db0bb2a3378"} Apr 16 18:05:24.122557 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:24.122495 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kl8xd" podStartSLOduration=0.978820712 podStartE2EDuration="2.12247402s" podCreationTimestamp="2026-04-16 18:05:22 +0000 UTC" firstStartedPulling="2026-04-16 18:05:22.845769624 +0000 UTC m=+196.958577055" lastFinishedPulling="2026-04-16 18:05:23.989422926 +0000 UTC m=+198.102230363" observedRunningTime="2026-04-16 18:05:24.121507879 +0000 UTC m=+198.234315333" watchObservedRunningTime="2026-04-16 18:05:24.12247402 +0000 UTC m=+198.235281475" Apr 16 18:05:24.241696 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:24.241597 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:24.395698 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:24.395661 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:05:24.400090 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:05:24.400049 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c80e072_3897_4516_ba1c_74201dec58e9.slice/crio-4fea2379db4edff60a9edf5137d8a2718cddd8dd7cbf2f08a00aa39057d5249d WatchSource:0}: Error finding container 4fea2379db4edff60a9edf5137d8a2718cddd8dd7cbf2f08a00aa39057d5249d: Status 404 returned error can't find the container with id 4fea2379db4edff60a9edf5137d8a2718cddd8dd7cbf2f08a00aa39057d5249d Apr 16 18:05:25.102594 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:25.102554 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" event={"ID":"070020cc-e67b-4965-a3df-3cf85fed6a85","Type":"ContainerStarted","Data":"43d92cbc00fc3d56787db0ec37e07e55d097b7d06a1e79489325598178910d96"} Apr 16 18:05:25.102594 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:25.102601 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" event={"ID":"070020cc-e67b-4965-a3df-3cf85fed6a85","Type":"ContainerStarted","Data":"6a765e1685e182064b1c70486d67dadd572c437eb6c672068e776fda5992de36"} Apr 16 18:05:25.104378 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:25.104333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerStarted","Data":"4fea2379db4edff60a9edf5137d8a2718cddd8dd7cbf2f08a00aa39057d5249d"} Apr 16 18:05:25.109538 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:25.109021 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7l6ll" event={"ID":"69c059ad-8285-4260-ab9e-9163abfdcada","Type":"ContainerStarted","Data":"dfa3cba763e99bb28d7dc85a6ff287c5afcc06928c417fc77addf8f8ae2579bb"} Apr 16 18:05:25.109538 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:25.109073 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7l6ll" 
event={"ID":"69c059ad-8285-4260-ab9e-9163abfdcada","Type":"ContainerStarted","Data":"500a3f5a4e13c2346a5cfcf9eae3bb2b6746e8417979745e19696ee395fe1d84"} Apr 16 18:05:25.135417 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:25.135310 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7l6ll" podStartSLOduration=2.391835793 podStartE2EDuration="3.135289951s" podCreationTimestamp="2026-04-16 18:05:22 +0000 UTC" firstStartedPulling="2026-04-16 18:05:22.639283347 +0000 UTC m=+196.752090780" lastFinishedPulling="2026-04-16 18:05:23.3827375 +0000 UTC m=+197.495544938" observedRunningTime="2026-04-16 18:05:25.134017534 +0000 UTC m=+199.246824990" watchObservedRunningTime="2026-04-16 18:05:25.135289951 +0000 UTC m=+199.248097406" Apr 16 18:05:26.113023 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:26.112989 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" event={"ID":"070020cc-e67b-4965-a3df-3cf85fed6a85","Type":"ContainerStarted","Data":"76e48f0c9cf7fc1983b650c51e1498c906344becc1df3bc820830e1255e71406"} Apr 16 18:05:26.114241 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:26.114214 2577 generic.go:358] "Generic (PLEG): container finished" podID="9c80e072-3897-4516-ba1c-74201dec58e9" containerID="018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4" exitCode=0 Apr 16 18:05:26.114351 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:26.114300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerDied","Data":"018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4"} Apr 16 18:05:26.134746 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:26.134707 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-g5cj2" podStartSLOduration=2.705777416 podStartE2EDuration="4.13469359s" podCreationTimestamp="2026-04-16 18:05:22 +0000 UTC" firstStartedPulling="2026-04-16 18:05:23.480794451 +0000 UTC m=+197.593601882" lastFinishedPulling="2026-04-16 18:05:24.909710619 +0000 UTC m=+199.022518056" observedRunningTime="2026-04-16 18:05:26.132607076 +0000 UTC m=+200.245414520" watchObservedRunningTime="2026-04-16 18:05:26.13469359 +0000 UTC m=+200.247501069" Apr 16 18:05:26.918997 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:26.918961 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj"] Apr 16 18:05:26.922797 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:26.922769 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj" Apr 16 18:05:26.925102 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:26.925078 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 18:05:26.925220 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:26.925117 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-r9sz9\"" Apr 16 18:05:26.930979 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:26.930163 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj"] Apr 16 18:05:27.015178 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:27.015140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dcbf1809-c5f1-459c-a1bc-66069006fd9a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kstnj\" (UID: \"dcbf1809-c5f1-459c-a1bc-66069006fd9a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj" Apr 16 18:05:27.115909 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:27.115877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dcbf1809-c5f1-459c-a1bc-66069006fd9a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kstnj\" (UID: \"dcbf1809-c5f1-459c-a1bc-66069006fd9a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj" Apr 16 18:05:27.118744 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:27.118719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dcbf1809-c5f1-459c-a1bc-66069006fd9a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kstnj\" (UID: \"dcbf1809-c5f1-459c-a1bc-66069006fd9a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj" Apr 16 18:05:27.235196 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:27.234481 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj" Apr 16 18:05:27.371163 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:27.371133 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj"] Apr 16 18:05:27.375495 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:05:27.375463 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcbf1809_c5f1_459c_a1bc_66069006fd9a.slice/crio-8304f8d5ec1e014eda64b30ec78ca16acde0ce92a54b92fc1748eb6c9830291a WatchSource:0}: Error finding container 8304f8d5ec1e014eda64b30ec78ca16acde0ce92a54b92fc1748eb6c9830291a: Status 404 returned error can't find the container with id 8304f8d5ec1e014eda64b30ec78ca16acde0ce92a54b92fc1748eb6c9830291a Apr 16 18:05:28.122742 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:28.122682 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj" event={"ID":"dcbf1809-c5f1-459c-a1bc-66069006fd9a","Type":"ContainerStarted","Data":"8304f8d5ec1e014eda64b30ec78ca16acde0ce92a54b92fc1748eb6c9830291a"} Apr 16 18:05:28.125808 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:28.125781 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerStarted","Data":"44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08"} Apr 16 18:05:28.125935 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:28.125815 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerStarted","Data":"2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8"} Apr 16 18:05:28.125935 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:28.125832 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerStarted","Data":"0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46"} Apr 16 18:05:28.125935 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:28.125847 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerStarted","Data":"dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e"} Apr 16 18:05:28.125935 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:28.125861 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerStarted","Data":"4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00"} Apr 16 18:05:29.129891 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:29.129858 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj" event={"ID":"dcbf1809-c5f1-459c-a1bc-66069006fd9a","Type":"ContainerStarted","Data":"c298831d0a4522cba31f10ba87554d3ee3d006edd2f6e165ed55881aa58687d6"} Apr 16 18:05:29.130301 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:29.130133 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj" Apr 16 18:05:29.132942 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:29.132911 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerStarted","Data":"89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517"} Apr 16 18:05:29.135304 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:29.135280 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj" Apr 16 18:05:29.152018 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:29.151930 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kstnj" podStartSLOduration=1.6284813740000001 podStartE2EDuration="3.151919453s" podCreationTimestamp="2026-04-16 18:05:26 +0000 UTC" firstStartedPulling="2026-04-16 18:05:27.377541037 +0000 UTC m=+201.490348474" lastFinishedPulling="2026-04-16 18:05:28.900979116 +0000 UTC m=+203.013786553" observedRunningTime="2026-04-16 18:05:29.149606152 +0000 UTC m=+203.262413603" watchObservedRunningTime="2026-04-16 18:05:29.151919453 +0000 UTC m=+203.264726906" Apr 16 18:05:29.179935 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:29.179876 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.182150483 podStartE2EDuration="6.179861956s" podCreationTimestamp="2026-04-16 18:05:23 +0000 UTC" firstStartedPulling="2026-04-16 18:05:24.402345315 +0000 UTC m=+198.515152747" lastFinishedPulling="2026-04-16 18:05:28.400056773 +0000 UTC m=+202.512864220" observedRunningTime="2026-04-16 18:05:29.17798188 +0000 UTC m=+203.290789335" watchObservedRunningTime="2026-04-16 18:05:29.179861956 +0000 UTC m=+203.292669409" Apr 16 18:05:32.261274 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:32.261230 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85dc8575d4-v85pb"] Apr 16 18:05:32.261739 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:05:32.261535 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" podUID="3e36eb57-6fbb-4969-815e-9ff1adf4c8f4" Apr 16 18:05:32.473426 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:32.473396 2577 scope.go:117] "RemoveContainer" containerID="ff93c972cd2b6444c1cacb7b846da1f7b090c0dd01d2239bbe9201de30a0b370" Apr 16 18:05:33.145716 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.145687 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:05:33.145881 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.145791 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:05:33.145881 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.145804 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" event={"ID":"39964503-ddba-4c2e-9063-e712eb49041b","Type":"ContainerStarted","Data":"523998ac1d7b4b73ac79e09a32e3cbba37b984d4f9683608ea023ffa27b1c50e"} Apr 16 18:05:33.146216 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.146197 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:05:33.150132 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.150116 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:05:33.165501 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.165459 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" podStartSLOduration=50.588242464 podStartE2EDuration="54.165443354s" podCreationTimestamp="2026-04-16 18:04:39 +0000 UTC" firstStartedPulling="2026-04-16 18:04:40.423816572 +0000 UTC m=+154.536624004" lastFinishedPulling="2026-04-16 18:04:44.001017462 +0000 UTC m=+158.113824894" observedRunningTime="2026-04-16 18:05:33.163692941 +0000 UTC m=+207.276500395" watchObservedRunningTime="2026-04-16 18:05:33.165443354 +0000 UTC m=+207.278250808" Apr 16 18:05:33.212408 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.212382 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-6g6wf" Apr 16 18:05:33.267875 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.267846 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-trusted-ca\") pod \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " Apr 16 18:05:33.268247 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.267902 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-image-registry-private-configuration\") pod \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " Apr 16 18:05:33.268247 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.268020 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-certificates\") pod \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " Apr 16 18:05:33.268247 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.268076 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-installation-pull-secrets\") pod \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " Apr 16 18:05:33.268247 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.268102 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-ca-trust-extracted\") pod \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " Apr 16 18:05:33.268247 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.268164 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-bound-sa-token\") pod \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " Apr 16 18:05:33.268247 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.268211 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdls6\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-kube-api-access-tdls6\") pod \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\" (UID: \"3e36eb57-6fbb-4969-815e-9ff1adf4c8f4\") " Apr 16 18:05:33.268247 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.268240 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:33.268638 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.268354 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:33.268692 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.268651 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-certificates\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:05:33.268692 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.268652 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:05:33.268692 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.268675 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-trusted-ca\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:05:33.270528 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.270498 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:33.270664 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.270641 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:33.270777 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.270753 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:33.270811 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.270763 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-kube-api-access-tdls6" (OuterVolumeSpecName: "kube-api-access-tdls6") pod "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4" (UID: "3e36eb57-6fbb-4969-815e-9ff1adf4c8f4"). InnerVolumeSpecName "kube-api-access-tdls6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:33.369534 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.369496 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-image-registry-private-configuration\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:05:33.369534 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.369535 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-installation-pull-secrets\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:05:33.369720 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.369551 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-ca-trust-extracted\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:05:33.369720 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.369565 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-bound-sa-token\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:05:33.369720 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.369583 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tdls6\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-kube-api-access-tdls6\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:05:33.401239 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.401156 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-l4wg4"] Apr 16 18:05:33.405665 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.405645 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-l4wg4" Apr 16 18:05:33.408002 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.407978 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:05:33.408114 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.408076 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:05:33.408114 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.408080 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-hd75p\"" Apr 16 18:05:33.414725 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.414702 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-l4wg4"] Apr 16 18:05:33.470455 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.470423 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtxpb\" (UniqueName: \"kubernetes.io/projected/fa774bed-dcba-4e18-8dfe-ae8bef67b1d1-kube-api-access-dtxpb\") pod \"downloads-586b57c7b4-l4wg4\" (UID: \"fa774bed-dcba-4e18-8dfe-ae8bef67b1d1\") " pod="openshift-console/downloads-586b57c7b4-l4wg4" Apr 16 18:05:33.571243 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.571210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtxpb\" (UniqueName: \"kubernetes.io/projected/fa774bed-dcba-4e18-8dfe-ae8bef67b1d1-kube-api-access-dtxpb\") pod \"downloads-586b57c7b4-l4wg4\" (UID: \"fa774bed-dcba-4e18-8dfe-ae8bef67b1d1\") " pod="openshift-console/downloads-586b57c7b4-l4wg4" Apr 16 18:05:33.584091 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.584056 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtxpb\" (UniqueName: \"kubernetes.io/projected/fa774bed-dcba-4e18-8dfe-ae8bef67b1d1-kube-api-access-dtxpb\") pod \"downloads-586b57c7b4-l4wg4\" (UID: \"fa774bed-dcba-4e18-8dfe-ae8bef67b1d1\") " pod="openshift-console/downloads-586b57c7b4-l4wg4" Apr 16 18:05:33.716081 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.715979 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-l4wg4" Apr 16 18:05:33.869037 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:33.868984 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-l4wg4"] Apr 16 18:05:33.871061 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:05:33.871031 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa774bed_dcba_4e18_8dfe_ae8bef67b1d1.slice/crio-caca2e3223d0850a6a1f6efc6607b084fe653d67cd886ff5010d878eab719fac WatchSource:0}: Error finding container caca2e3223d0850a6a1f6efc6607b084fe653d67cd886ff5010d878eab719fac: Status 404 returned error can't find the container with id caca2e3223d0850a6a1f6efc6607b084fe653d67cd886ff5010d878eab719fac Apr 16 18:05:34.149796 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:34.149756 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-l4wg4" event={"ID":"fa774bed-dcba-4e18-8dfe-ae8bef67b1d1","Type":"ContainerStarted","Data":"caca2e3223d0850a6a1f6efc6607b084fe653d67cd886ff5010d878eab719fac"} Apr 16 18:05:34.149796 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:34.149774 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85dc8575d4-v85pb" Apr 16 18:05:34.183402 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:34.183357 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85dc8575d4-v85pb"] Apr 16 18:05:34.186560 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:34.186537 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-85dc8575d4-v85pb"] Apr 16 18:05:34.282906 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:34.282874 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4-registry-tls\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:05:34.477805 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:34.477722 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e36eb57-6fbb-4969-815e-9ff1adf4c8f4" path="/var/lib/kubelet/pods/3e36eb57-6fbb-4969-815e-9ff1adf4c8f4/volumes" Apr 16 18:05:44.076835 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.076798 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76c585b9d9-scv87"] Apr 16 18:05:44.081531 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.081502 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.084046 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.084020 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:05:44.084187 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.084146 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:05:44.084704 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.084686 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:05:44.085074 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.085052 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:05:44.085171 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.085052 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:05:44.085400 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.085361 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-gjpks\"" Apr 16 18:05:44.094562 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.094534 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c585b9d9-scv87"] Apr 16 18:05:44.175460 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.175423 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-console-config\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.175661 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.175486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-oauth-serving-cert\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.175661 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.175520 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cf61734-e908-4e22-98a7-c0659cd49281-console-serving-cert\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.175661 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.175603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-service-ca\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.175661 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.175639 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npn8f\" (UniqueName: \"kubernetes.io/projected/9cf61734-e908-4e22-98a7-c0659cd49281-kube-api-access-npn8f\") pod 
\"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.175880 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.175710 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9cf61734-e908-4e22-98a7-c0659cd49281-console-oauth-config\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.276745 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.276701 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-console-config\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.276937 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.276767 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-oauth-serving-cert\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.276937 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.276808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cf61734-e908-4e22-98a7-c0659cd49281-console-serving-cert\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.276937 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.276878 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-service-ca\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.276937 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.276921 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npn8f\" (UniqueName: \"kubernetes.io/projected/9cf61734-e908-4e22-98a7-c0659cd49281-kube-api-access-npn8f\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.277153 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.276964 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9cf61734-e908-4e22-98a7-c0659cd49281-console-oauth-config\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.277565 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.277529 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-oauth-serving-cert\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.277696 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.277638 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-console-config\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.277696 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.277638 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-service-ca\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.279682 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.279657 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cf61734-e908-4e22-98a7-c0659cd49281-console-serving-cert\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.279682 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.279673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9cf61734-e908-4e22-98a7-c0659cd49281-console-oauth-config\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.286631 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.286582 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npn8f\" (UniqueName: \"kubernetes.io/projected/9cf61734-e908-4e22-98a7-c0659cd49281-kube-api-access-npn8f\") pod \"console-76c585b9d9-scv87\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:44.393186 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:44.393084 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:05:46.513456 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.513415 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-bccbff784-n5mgp"] Apr 16 18:05:46.517711 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.517686 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.526047 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.526019 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 18:05:46.527153 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.527095 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bccbff784-n5mgp"] Apr 16 18:05:46.699494 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.699457 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-trusted-ca-bundle\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.699692 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.699528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-serving-cert\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.699692 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.699585 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-oauth-serving-cert\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.699692 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.699603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md7zv\" (UniqueName: \"kubernetes.io/projected/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-kube-api-access-md7zv\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.699828 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.699743 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-config\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.699828 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.699801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-oauth-config\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.699925 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.699866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-service-ca\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.800984 ip-10-0-134-133 kubenswrapper[2577]: I0416 
18:05:46.800886 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-config\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.800984 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.800959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-oauth-config\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.801217 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.800989 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-service-ca\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.801217 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.801187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-trusted-ca-bundle\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.801319 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.801267 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-serving-cert\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.801409 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.801339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-oauth-serving-cert\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.801409 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.801387 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-md7zv\" (UniqueName: \"kubernetes.io/projected/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-kube-api-access-md7zv\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.801733 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.801694 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-config\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.801830 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.801783 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-service-ca\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " 
pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.802046 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.802013 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-oauth-serving-cert\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.802126 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.802081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-trusted-ca-bundle\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.803671 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.803651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-oauth-config\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.804018 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.803998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-serving-cert\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.810778 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.810741 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-md7zv\" (UniqueName: \"kubernetes.io/projected/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-kube-api-access-md7zv\") pod \"console-bccbff784-n5mgp\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:46.830669 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:46.830624 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:50.858407 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:50.858330 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c585b9d9-scv87"] Apr 16 18:05:50.861073 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:05:50.861031 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cf61734_e908_4e22_98a7_c0659cd49281.slice/crio-257932925dc08b1afbec3adeb849610e0ab20663a4b68c1dfc43ed34350975d5 WatchSource:0}: Error finding container 257932925dc08b1afbec3adeb849610e0ab20663a4b68c1dfc43ed34350975d5: Status 404 returned error can't find the container with id 257932925dc08b1afbec3adeb849610e0ab20663a4b68c1dfc43ed34350975d5 Apr 16 18:05:50.876524 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:50.876483 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bccbff784-n5mgp"] Apr 16 18:05:50.879216 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:05:50.879187 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod862e4c2a_87b6_48ef_bd3d_625c7b6ee95b.slice/crio-321175c0a061042e17cb978328c30b5a8c649fdab52e1441429cf338dc13102f WatchSource:0}: Error finding container 321175c0a061042e17cb978328c30b5a8c649fdab52e1441429cf338dc13102f: Status 404 returned error can't find the container with id 321175c0a061042e17cb978328c30b5a8c649fdab52e1441429cf338dc13102f Apr 16 18:05:51.209585 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:51.209552 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c585b9d9-scv87" event={"ID":"9cf61734-e908-4e22-98a7-c0659cd49281","Type":"ContainerStarted","Data":"257932925dc08b1afbec3adeb849610e0ab20663a4b68c1dfc43ed34350975d5"} Apr 16 18:05:51.211092 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:51.211065 2577 generic.go:358] "Generic (PLEG): container finished" podID="71948437-c3bf-419e-8170-14db67f520f4" containerID="baf1fcb4ca5151743b1cb84e1ada675ea681dfd04544ed79c18d7f9b51e92aa8" exitCode=0 Apr 16 18:05:51.211224 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:51.211144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" event={"ID":"71948437-c3bf-419e-8170-14db67f520f4","Type":"ContainerDied","Data":"baf1fcb4ca5151743b1cb84e1ada675ea681dfd04544ed79c18d7f9b51e92aa8"} Apr 16 18:05:51.211888 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:51.211582 2577 scope.go:117] "RemoveContainer" containerID="baf1fcb4ca5151743b1cb84e1ada675ea681dfd04544ed79c18d7f9b51e92aa8" Apr 16 18:05:51.212794 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:51.212767 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bccbff784-n5mgp" event={"ID":"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b","Type":"ContainerStarted","Data":"321175c0a061042e17cb978328c30b5a8c649fdab52e1441429cf338dc13102f"} Apr 16 18:05:51.214288 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:51.214262 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-l4wg4" event={"ID":"fa774bed-dcba-4e18-8dfe-ae8bef67b1d1","Type":"ContainerStarted","Data":"36dc98b35c28f3d7697fd917e390c26595e405a49900c06d1cb3debf79f91aa4"} Apr 16 18:05:51.214871 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:51.214766 2577 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-l4wg4" Apr 16 18:05:51.223654 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:51.223625 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-l4wg4" Apr 16 18:05:51.244420 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:51.244351 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-l4wg4" podStartSLOduration=1.315435217 podStartE2EDuration="18.244330696s" podCreationTimestamp="2026-04-16 18:05:33 +0000 UTC" firstStartedPulling="2026-04-16 18:05:33.872906996 +0000 UTC m=+207.985714428" lastFinishedPulling="2026-04-16 18:05:50.801802472 +0000 UTC m=+224.914609907" observedRunningTime="2026-04-16 18:05:51.24326731 +0000 UTC m=+225.356074766" watchObservedRunningTime="2026-04-16 18:05:51.244330696 +0000 UTC m=+225.357138152" Apr 16 18:05:52.223355 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:52.222872 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-zrbs6" event={"ID":"71948437-c3bf-419e-8170-14db67f520f4","Type":"ContainerStarted","Data":"cb4b13f5bdac40f4768d12fa0c02145500624a1f34e18b3f39a458d4afb16960"} Apr 16 18:05:55.235325 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:55.235286 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c585b9d9-scv87" event={"ID":"9cf61734-e908-4e22-98a7-c0659cd49281","Type":"ContainerStarted","Data":"4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557"} Apr 16 18:05:55.237126 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:55.237084 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bccbff784-n5mgp" event={"ID":"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b","Type":"ContainerStarted","Data":"2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6"} Apr 16 18:05:55.256625 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:55.256563 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76c585b9d9-scv87" podStartSLOduration=7.354245013 podStartE2EDuration="11.256539554s" podCreationTimestamp="2026-04-16 18:05:44 +0000 UTC" firstStartedPulling="2026-04-16 18:05:50.863211627 +0000 UTC m=+224.976019062" lastFinishedPulling="2026-04-16 18:05:54.765506169 +0000 UTC m=+228.878313603" observedRunningTime="2026-04-16 18:05:55.254669023 +0000 UTC m=+229.367476489" watchObservedRunningTime="2026-04-16 18:05:55.256539554 +0000 UTC m=+229.369347008" Apr 16 18:05:55.276382 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:55.276318 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bccbff784-n5mgp" podStartSLOduration=5.398829642 podStartE2EDuration="9.276297488s" podCreationTimestamp="2026-04-16 18:05:46 +0000 UTC" firstStartedPulling="2026-04-16 18:05:50.881243232 +0000 UTC m=+224.994050664" lastFinishedPulling="2026-04-16 18:05:54.758711078 +0000 UTC m=+228.871518510" observedRunningTime="2026-04-16 18:05:55.274174088 +0000 UTC m=+229.386981566" watchObservedRunningTime="2026-04-16 18:05:55.276297488 +0000 UTC m=+229.389104942" Apr 16 18:05:56.831390 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:56.831325 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-bccbff784-n5mgp" Apr 16 
18:05:56.831390 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:56.831400 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:56.836921 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:56.836891 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:57.249544 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:57.249462 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:05:57.300798 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:05:57.300764 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c585b9d9-scv87"] Apr 16 18:06:00.257325 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:00.257285 2577 generic.go:358] "Generic (PLEG): container finished" podID="879e1a6e-7abe-4b70-9fdd-76b30b854006" containerID="74176636e1d6e9f4b810112bd3d17c43d3215d7b12b600ab3c0944971a071940" exitCode=0 Apr 16 18:06:00.257807 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:00.257360 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" event={"ID":"879e1a6e-7abe-4b70-9fdd-76b30b854006","Type":"ContainerDied","Data":"74176636e1d6e9f4b810112bd3d17c43d3215d7b12b600ab3c0944971a071940"} Apr 16 18:06:00.257864 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:00.257842 2577 scope.go:117] "RemoveContainer" containerID="74176636e1d6e9f4b810112bd3d17c43d3215d7b12b600ab3c0944971a071940" Apr 16 18:06:01.262543 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:01.262506 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-lbrtk" event={"ID":"879e1a6e-7abe-4b70-9fdd-76b30b854006","Type":"ContainerStarted","Data":"0e02077a1386adac78e973406c0710f8b66535528bbf372bec8d94a3cc2741f1"} Apr 16 18:06:04.393687 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:04.393649 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:06:05.274925 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:05.274892 2577 generic.go:358] "Generic (PLEG): container finished" podID="646dbece-2434-4ec6-ad3e-0f3009cde6a3" containerID="f4c7d9ae014e83c794bd07a0154c869bc6e24149b2b8432e2ddf7d8493b4e1dc" exitCode=0 Apr 16 18:06:05.275092 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:05.274930 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" event={"ID":"646dbece-2434-4ec6-ad3e-0f3009cde6a3","Type":"ContainerDied","Data":"f4c7d9ae014e83c794bd07a0154c869bc6e24149b2b8432e2ddf7d8493b4e1dc"} Apr 16 18:06:05.275271 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:05.275254 2577 scope.go:117] "RemoveContainer" containerID="f4c7d9ae014e83c794bd07a0154c869bc6e24149b2b8432e2ddf7d8493b4e1dc" Apr 16 18:06:06.279811 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:06.279782 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p68j9" event={"ID":"646dbece-2434-4ec6-ad3e-0f3009cde6a3","Type":"ContainerStarted","Data":"d36b9d0a92631959c4e7b27cd6c637e13df7ff93e9e27ac084f076d3b9a5f03c"} Apr 16 18:06:17.304016 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:17.303974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:06:17.306285 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:17.306264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f009e89a-5e15-4d47-81de-24ab98cb437b-metrics-certs\") pod \"network-metrics-daemon-znzwl\" (UID: \"f009e89a-5e15-4d47-81de-24ab98cb437b\") " pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:06:17.376526 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:17.376499 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jt7dn\"" Apr 16 18:06:17.385060 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:17.385040 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-znzwl" Apr 16 18:06:17.507813 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:17.507782 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-znzwl"] Apr 16 18:06:17.511065 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:06:17.511037 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf009e89a_5e15_4d47_81de_24ab98cb437b.slice/crio-8c68690e3a40a44c37884fddd6c29d42f34eeb618dfd7b968580d4f05e5e272f WatchSource:0}: Error finding container 8c68690e3a40a44c37884fddd6c29d42f34eeb618dfd7b968580d4f05e5e272f: Status 404 returned error can't find the container with id 8c68690e3a40a44c37884fddd6c29d42f34eeb618dfd7b968580d4f05e5e272f Apr 16 18:06:18.315104 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:18.315068 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-znzwl" event={"ID":"f009e89a-5e15-4d47-81de-24ab98cb437b","Type":"ContainerStarted","Data":"8c68690e3a40a44c37884fddd6c29d42f34eeb618dfd7b968580d4f05e5e272f"} Apr 16 18:06:19.319334 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:19.319301 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-znzwl" event={"ID":"f009e89a-5e15-4d47-81de-24ab98cb437b","Type":"ContainerStarted","Data":"bacfc0a1adb263e96a82882edb66f8b133688aba77b1912264dec452e32e3c00"} Apr 16 18:06:20.323729 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:20.323647 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-znzwl" event={"ID":"f009e89a-5e15-4d47-81de-24ab98cb437b","Type":"ContainerStarted","Data":"770aeecf68b8279c3da0fb8c2af546aded89b5dfd25bfafcacccb61727631171"} Apr 16 18:06:20.339804 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:20.339756 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-znzwl" podStartSLOduration=252.823880274 podStartE2EDuration="4m14.339742017s" podCreationTimestamp="2026-04-16 18:02:06 +0000 UTC" firstStartedPulling="2026-04-16 18:06:17.513042824 +0000 UTC m=+251.625850257" lastFinishedPulling="2026-04-16 18:06:19.028904554 +0000 UTC m=+253.141712000" observedRunningTime="2026-04-16 18:06:20.338316761 +0000 UTC m=+254.451124216" watchObservedRunningTime="2026-04-16 18:06:20.339742017 +0000 UTC m=+254.452549471" Apr 16 18:06:22.323987 ip-10-0-134-133 kubenswrapper[2577]: I0416 
18:06:22.323947 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76c585b9d9-scv87" podUID="9cf61734-e908-4e22-98a7-c0659cd49281" containerName="console" containerID="cri-o://4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557" gracePeriod=15 Apr 16 18:06:22.598721 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.598700 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c585b9d9-scv87_9cf61734-e908-4e22-98a7-c0659cd49281/console/0.log" Apr 16 18:06:22.598828 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.598765 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:06:22.652085 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.652055 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cf61734-e908-4e22-98a7-c0659cd49281-console-serving-cert\") pod \"9cf61734-e908-4e22-98a7-c0659cd49281\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " Apr 16 18:06:22.652085 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.652092 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npn8f\" (UniqueName: \"kubernetes.io/projected/9cf61734-e908-4e22-98a7-c0659cd49281-kube-api-access-npn8f\") pod \"9cf61734-e908-4e22-98a7-c0659cd49281\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " Apr 16 18:06:22.652308 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.652125 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-console-config\") pod \"9cf61734-e908-4e22-98a7-c0659cd49281\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " Apr 16 18:06:22.652308 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.652141 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-service-ca\") pod \"9cf61734-e908-4e22-98a7-c0659cd49281\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " Apr 16 18:06:22.652308 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.652164 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9cf61734-e908-4e22-98a7-c0659cd49281-console-oauth-config\") pod \"9cf61734-e908-4e22-98a7-c0659cd49281\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " Apr 16 18:06:22.652308 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.652202 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-oauth-serving-cert\") pod \"9cf61734-e908-4e22-98a7-c0659cd49281\" (UID: \"9cf61734-e908-4e22-98a7-c0659cd49281\") " Apr 16 18:06:22.652725 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.652677 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-console-config" (OuterVolumeSpecName: "console-config") pod "9cf61734-e908-4e22-98a7-c0659cd49281" (UID: "9cf61734-e908-4e22-98a7-c0659cd49281"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:22.652848 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.652711 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-service-ca" (OuterVolumeSpecName: "service-ca") pod "9cf61734-e908-4e22-98a7-c0659cd49281" (UID: "9cf61734-e908-4e22-98a7-c0659cd49281"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:22.652848 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.652737 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9cf61734-e908-4e22-98a7-c0659cd49281" (UID: "9cf61734-e908-4e22-98a7-c0659cd49281"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:22.654401 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.654356 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf61734-e908-4e22-98a7-c0659cd49281-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9cf61734-e908-4e22-98a7-c0659cd49281" (UID: "9cf61734-e908-4e22-98a7-c0659cd49281"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:22.654740 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.654719 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cf61734-e908-4e22-98a7-c0659cd49281-kube-api-access-npn8f" (OuterVolumeSpecName: "kube-api-access-npn8f") pod "9cf61734-e908-4e22-98a7-c0659cd49281" (UID: "9cf61734-e908-4e22-98a7-c0659cd49281"). InnerVolumeSpecName "kube-api-access-npn8f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:06:22.654819 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.654777 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf61734-e908-4e22-98a7-c0659cd49281-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9cf61734-e908-4e22-98a7-c0659cd49281" (UID: "9cf61734-e908-4e22-98a7-c0659cd49281"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:22.753697 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.753659 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-console-config\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.753697 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.753691 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-service-ca\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.753697 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.753703 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9cf61734-e908-4e22-98a7-c0659cd49281-console-oauth-config\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.753960 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.753716 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9cf61734-e908-4e22-98a7-c0659cd49281-oauth-serving-cert\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.753960 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.753730 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cf61734-e908-4e22-98a7-c0659cd49281-console-serving-cert\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:22.753960 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:22.753741 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-npn8f\" (UniqueName: \"kubernetes.io/projected/9cf61734-e908-4e22-98a7-c0659cd49281-kube-api-access-npn8f\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:23.335014 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:23.334987 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c585b9d9-scv87_9cf61734-e908-4e22-98a7-c0659cd49281/console/0.log" Apr 16 18:06:23.335447 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:23.335030 2577 generic.go:358] "Generic (PLEG): container finished" podID="9cf61734-e908-4e22-98a7-c0659cd49281" containerID="4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557" exitCode=2 Apr 16 18:06:23.335447 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:23.335104 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76c585b9d9-scv87" Apr 16 18:06:23.335447 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:23.335103 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c585b9d9-scv87" event={"ID":"9cf61734-e908-4e22-98a7-c0659cd49281","Type":"ContainerDied","Data":"4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557"} Apr 16 18:06:23.335447 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:23.335204 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c585b9d9-scv87" event={"ID":"9cf61734-e908-4e22-98a7-c0659cd49281","Type":"ContainerDied","Data":"257932925dc08b1afbec3adeb849610e0ab20663a4b68c1dfc43ed34350975d5"} Apr 16 18:06:23.335447 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:23.335222 2577 scope.go:117] "RemoveContainer" containerID="4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557" Apr 16 18:06:23.347938 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:23.347916 2577 scope.go:117] "RemoveContainer" containerID="4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557" Apr 16 18:06:23.348191 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:06:23.348169 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557\": container with ID starting with 4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557 not found: ID does not exist" containerID="4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557" Apr 16 18:06:23.348246 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:23.348200 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557"} err="failed to get container status \"4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557\": rpc error: code = NotFound desc = could not find container \"4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557\": container with ID starting with 4f458dae163f288d6ed49d4924841d61120ffe987d9c50dd61cbc3e4d8d4b557 not found: ID does not exist" Apr 16 18:06:23.357221 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:23.357200 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c585b9d9-scv87"] Apr 16 18:06:23.360597 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:23.360575 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76c585b9d9-scv87"] Apr 16 18:06:24.477453 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:24.477419 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cf61734-e908-4e22-98a7-c0659cd49281" path="/var/lib/kubelet/pods/9cf61734-e908-4e22-98a7-c0659cd49281/volumes" Apr 16 18:06:41.167511 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.167479 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cd57f7657-xs4zm"] Apr 16 18:06:41.168054 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.167856 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cf61734-e908-4e22-98a7-c0659cd49281" containerName="console" Apr 16 18:06:41.168054 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.167874 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf61734-e908-4e22-98a7-c0659cd49281" containerName="console" Apr 16 18:06:41.168054 ip-10-0-134-133 
kubenswrapper[2577]: I0416 18:06:41.167943 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cf61734-e908-4e22-98a7-c0659cd49281" containerName="console" Apr 16 18:06:41.173445 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.173404 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.182355 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.182328 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd57f7657-xs4zm"] Apr 16 18:06:41.210044 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.210005 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-trusted-ca-bundle\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.210258 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.210060 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-serving-cert\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.210258 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.210125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-service-ca\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.210258 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.210152 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-oauth-serving-cert\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.210258 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.210196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-oauth-config\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.210258 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.210241 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22d67\" (UniqueName: \"kubernetes.io/projected/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-kube-api-access-22d67\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.210258 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.210260 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-config\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " 
pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.311723 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.311685 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22d67\" (UniqueName: \"kubernetes.io/projected/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-kube-api-access-22d67\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.311723 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.311730 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-config\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.311979 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.311754 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-trusted-ca-bundle\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.311979 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.311776 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-serving-cert\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.311979 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.311818 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-service-ca\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.311979 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.311837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-oauth-serving-cert\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.311979 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.311886 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-oauth-config\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.312597 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.312562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-service-ca\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.312722 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.312665 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-config\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.312765 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.312733 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-oauth-serving-cert\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.312797 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.312737 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-trusted-ca-bundle\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.314308 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.314277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-oauth-config\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.314481 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.314459 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-serving-cert\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.320877 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.320854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22d67\" (UniqueName: \"kubernetes.io/projected/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-kube-api-access-22d67\") pod \"console-7cd57f7657-xs4zm\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.483986 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.483898 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:41.619500 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:41.619462 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd57f7657-xs4zm"] Apr 16 18:06:41.625441 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:06:41.625409 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacb1fd9e_ee4e_41cd_bd7d_bfc774dacb41.slice/crio-c902e4edb4fca44c71e11fff78f636125bf83cd78c55b1617bc75ddf0884544e WatchSource:0}: Error finding container c902e4edb4fca44c71e11fff78f636125bf83cd78c55b1617bc75ddf0884544e: Status 404 returned error can't find the container with id c902e4edb4fca44c71e11fff78f636125bf83cd78c55b1617bc75ddf0884544e Apr 16 18:06:42.401120 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:42.401030 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd57f7657-xs4zm" event={"ID":"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41","Type":"ContainerStarted","Data":"a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8"} Apr 16 18:06:42.401120 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:42.401074 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd57f7657-xs4zm" event={"ID":"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41","Type":"ContainerStarted","Data":"c902e4edb4fca44c71e11fff78f636125bf83cd78c55b1617bc75ddf0884544e"} Apr 16 18:06:42.420388 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:42.418467 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cd57f7657-xs4zm" podStartSLOduration=1.418449012 podStartE2EDuration="1.418449012s" podCreationTimestamp="2026-04-16 18:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:06:42.415929144 +0000 UTC m=+276.528736597" watchObservedRunningTime="2026-04-16 18:06:42.418449012 +0000 UTC m=+276.531256466" Apr 16 18:06:42.736780 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:42.736689 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:06:42.737169 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:42.737105 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="alertmanager" containerID="cri-o://4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00" gracePeriod=120 Apr 16 18:06:42.737259 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:42.737163 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="kube-rbac-proxy-web" containerID="cri-o://0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46" gracePeriod=120 Apr 16 18:06:42.737259 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:42.737211 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="kube-rbac-proxy-metric" containerID="cri-o://44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08" gracePeriod=120 Apr 16 18:06:42.737259 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:42.737248 2577 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="kube-rbac-proxy" containerID="cri-o://2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8" gracePeriod=120 Apr 16 18:06:42.737460 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:42.737200 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="config-reloader" containerID="cri-o://dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e" gracePeriod=120 Apr 16 18:06:42.737460 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:42.737212 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="prom-label-proxy" containerID="cri-o://89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517" gracePeriod=120 Apr 16 18:06:43.406667 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:43.406630 2577 generic.go:358] "Generic (PLEG): container finished" podID="9c80e072-3897-4516-ba1c-74201dec58e9" containerID="89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517" exitCode=0 Apr 16 18:06:43.406667 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:43.406656 2577 generic.go:358] "Generic (PLEG): container finished" podID="9c80e072-3897-4516-ba1c-74201dec58e9" containerID="2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8" exitCode=0 Apr 16 18:06:43.406667 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:43.406664 2577 generic.go:358] "Generic (PLEG): container finished" podID="9c80e072-3897-4516-ba1c-74201dec58e9" containerID="dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e" exitCode=0 Apr 16 18:06:43.406667 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:43.406672 2577 generic.go:358] "Generic (PLEG): container finished" podID="9c80e072-3897-4516-ba1c-74201dec58e9" containerID="4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00" exitCode=0 Apr 16 18:06:43.407149 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:43.406701 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerDied","Data":"89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517"} Apr 16 18:06:43.407149 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:43.406734 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerDied","Data":"2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8"} Apr 16 18:06:43.407149 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:43.406743 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerDied","Data":"dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e"} Apr 16 18:06:43.407149 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:43.406752 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerDied","Data":"4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00"} Apr 16 18:06:43.991212 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:43.991189 
2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.039136 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039094 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-metrics-client-ca\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.039136 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039143 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.039405 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039161 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-trusted-ca-bundle\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.039405 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039180 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-cluster-tls-config\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.039405 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039284 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c80e072-3897-4516-ba1c-74201dec58e9-tls-assets\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.039405 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039337 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d6vr\" (UniqueName: \"kubernetes.io/projected/9c80e072-3897-4516-ba1c-74201dec58e9-kube-api-access-2d6vr\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.039405 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039392 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-main-tls\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.039673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039421 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c80e072-3897-4516-ba1c-74201dec58e9-config-out\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.039673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039454 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-main-db\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" 
(UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.039673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039488 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.039673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039549 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-web-config\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.039673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039550 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:44.039673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039575 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:44.039673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039641 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy-web\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.039673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039668 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-config-volume\") pod \"9c80e072-3897-4516-ba1c-74201dec58e9\" (UID: \"9c80e072-3897-4516-ba1c-74201dec58e9\") " Apr 16 18:06:44.040063 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039978 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-metrics-client-ca\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.040063 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.039998 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.041204 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.041151 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-main-db" (OuterVolumeSpecName: 
"alertmanager-main-db") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:06:44.043602 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.043482 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:44.043602 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.043557 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c80e072-3897-4516-ba1c-74201dec58e9-config-out" (OuterVolumeSpecName: "config-out") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:06:44.043770 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.043678 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:44.044215 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.044188 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:44.045189 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.045153 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c80e072-3897-4516-ba1c-74201dec58e9-kube-api-access-2d6vr" (OuterVolumeSpecName: "kube-api-access-2d6vr") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "kube-api-access-2d6vr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:06:44.045309 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.045178 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-config-volume" (OuterVolumeSpecName: "config-volume") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:44.045309 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.045244 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:44.045587 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.045558 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c80e072-3897-4516-ba1c-74201dec58e9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:06:44.048237 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.048180 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:44.055424 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.055395 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-web-config" (OuterVolumeSpecName: "web-config") pod "9c80e072-3897-4516-ba1c-74201dec58e9" (UID: "9c80e072-3897-4516-ba1c-74201dec58e9"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:44.140778 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.140740 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.140778 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.140772 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-config-volume\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.140778 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.140784 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.141011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.140795 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-cluster-tls-config\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.141011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.140804 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9c80e072-3897-4516-ba1c-74201dec58e9-tls-assets\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.141011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.140813 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2d6vr\" (UniqueName: \"kubernetes.io/projected/9c80e072-3897-4516-ba1c-74201dec58e9-kube-api-access-2d6vr\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.141011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.140822 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-main-tls\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.141011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.140833 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9c80e072-3897-4516-ba1c-74201dec58e9-config-out\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.141011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.140842 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9c80e072-3897-4516-ba1c-74201dec58e9-alertmanager-main-db\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.141011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.140852 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.141011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.140861 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/9c80e072-3897-4516-ba1c-74201dec58e9-web-config\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:06:44.412572 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.412539 2577 generic.go:358] "Generic (PLEG): container finished" podID="9c80e072-3897-4516-ba1c-74201dec58e9" containerID="44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08" exitCode=0 Apr 16 18:06:44.412572 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.412567 2577 generic.go:358] "Generic (PLEG): container finished" podID="9c80e072-3897-4516-ba1c-74201dec58e9" containerID="0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46" exitCode=0 Apr 16 18:06:44.413039 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.412642 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerDied","Data":"44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08"} Apr 16 18:06:44.413039 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.412692 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerDied","Data":"0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46"} Apr 16 18:06:44.413039 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.412712 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9c80e072-3897-4516-ba1c-74201dec58e9","Type":"ContainerDied","Data":"4fea2379db4edff60a9edf5137d8a2718cddd8dd7cbf2f08a00aa39057d5249d"} Apr 16 18:06:44.413039 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.412657 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.413039 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.412734 2577 scope.go:117] "RemoveContainer" containerID="89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517" Apr 16 18:06:44.420048 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.420033 2577 scope.go:117] "RemoveContainer" containerID="44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08" Apr 16 18:06:44.426729 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.426712 2577 scope.go:117] "RemoveContainer" containerID="2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8" Apr 16 18:06:44.433352 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.433331 2577 scope.go:117] "RemoveContainer" containerID="0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46" Apr 16 18:06:44.437602 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.437577 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:06:44.441474 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.441454 2577 scope.go:117] "RemoveContainer" containerID="dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e" Apr 16 18:06:44.442603 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.442578 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:06:44.448602 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.448578 2577 scope.go:117] "RemoveContainer" containerID="4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00" Apr 16 18:06:44.455511 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.455493 2577 scope.go:117] "RemoveContainer" containerID="018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4" Apr 16 18:06:44.462323 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.462303 2577 scope.go:117] "RemoveContainer" containerID="89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517" Apr 16 18:06:44.462641 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:06:44.462617 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517\": container with ID starting with 89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517 not found: ID does not exist" containerID="89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517" Apr 16 18:06:44.462707 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.462651 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517"} err="failed to get container status \"89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517\": rpc error: code = NotFound desc = could not find container \"89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517\": container with ID starting with 89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517 not found: ID does not exist" Apr 16 18:06:44.462707 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.462673 2577 scope.go:117] "RemoveContainer" containerID="44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08" Apr 16 18:06:44.462939 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:06:44.462922 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08\": container with ID starting with 44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08 not found: ID does not exist" containerID="44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08" Apr 16 18:06:44.462978 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.462953 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08"} err="failed to get container status \"44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08\": rpc error: code = NotFound desc = could not find container \"44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08\": container with ID starting with 44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08 not found: ID does not exist" Apr 16 18:06:44.462978 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.462970 2577 scope.go:117] "RemoveContainer" containerID="2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8" Apr 16 18:06:44.463188 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:06:44.463171 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8\": container with ID starting with 2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8 not found: ID does not exist" containerID="2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8" Apr 16 18:06:44.463226 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.463193 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8"} err="failed to get container status \"2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8\": rpc error: code = NotFound desc = could not find container \"2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8\": container with ID starting with 2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8 not found: ID does not exist" Apr 16 18:06:44.463226 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.463207 2577 scope.go:117] "RemoveContainer" containerID="0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46" Apr 16 18:06:44.463484 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:06:44.463465 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46\": container with ID starting with 0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46 not found: ID does not exist" containerID="0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46" Apr 16 18:06:44.463535 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.463488 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46"} err="failed to get container status \"0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46\": rpc error: code = NotFound desc = could not find container \"0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46\": container with ID starting with 0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46 not found: ID does not exist" Apr 16 18:06:44.463535 ip-10-0-134-133 kubenswrapper[2577]: I0416 
18:06:44.463503 2577 scope.go:117] "RemoveContainer" containerID="dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e" Apr 16 18:06:44.463735 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:06:44.463720 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e\": container with ID starting with dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e not found: ID does not exist" containerID="dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e" Apr 16 18:06:44.463785 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.463737 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e"} err="failed to get container status \"dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e\": rpc error: code = NotFound desc = could not find container \"dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e\": container with ID starting with dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e not found: ID does not exist" Apr 16 18:06:44.463785 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.463751 2577 scope.go:117] "RemoveContainer" containerID="4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00" Apr 16 18:06:44.463969 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:06:44.463953 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00\": container with ID starting with 4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00 not found: ID does not exist" containerID="4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00" Apr 16 18:06:44.464022 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.463975 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00"} err="failed to get container status \"4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00\": rpc error: code = NotFound desc = could not find container \"4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00\": container with ID starting with 4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00 not found: ID does not exist" Apr 16 18:06:44.464022 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.463993 2577 scope.go:117] "RemoveContainer" containerID="018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4" Apr 16 18:06:44.464238 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:06:44.464217 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4\": container with ID starting with 018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4 not found: ID does not exist" containerID="018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4" Apr 16 18:06:44.464238 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.464241 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4"} err="failed to get container status 
\"018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4\": rpc error: code = NotFound desc = could not find container \"018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4\": container with ID starting with 018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4 not found: ID does not exist" Apr 16 18:06:44.464418 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.464255 2577 scope.go:117] "RemoveContainer" containerID="89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517" Apr 16 18:06:44.464558 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.464503 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517"} err="failed to get container status \"89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517\": rpc error: code = NotFound desc = could not find container \"89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517\": container with ID starting with 89bbe1760414144da0e662518c37799c848eea4c373a7cff26012fff07281517 not found: ID does not exist" Apr 16 18:06:44.464558 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.464528 2577 scope.go:117] "RemoveContainer" containerID="44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08" Apr 16 18:06:44.464779 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.464762 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08"} err="failed to get container status \"44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08\": rpc error: code = NotFound desc = could not find container \"44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08\": container with ID starting with 44dba8abdb6a58898f64305b3cb6e0d00374d12759005f268cdeb85bf569ee08 not found: ID does not exist" Apr 16 18:06:44.464838 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.464780 2577 scope.go:117] "RemoveContainer" containerID="2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8" Apr 16 18:06:44.465021 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.465001 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8"} err="failed to get container status \"2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8\": rpc error: code = NotFound desc = could not find container \"2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8\": container with ID starting with 2ad7eca9ae9d21032fe87166f156d9fc119ee203092062a4899d8b87490cf5f8 not found: ID does not exist" Apr 16 18:06:44.465021 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.465020 2577 scope.go:117] "RemoveContainer" containerID="0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46" Apr 16 18:06:44.465237 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.465220 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46"} err="failed to get container status \"0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46\": rpc error: code = NotFound desc = could not find container \"0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46\": container with ID starting with 0146dc31c0c8bdf5fe890e7aa6ce9bc942fc40dbfe7c627d649c876fdc5bec46 not found: ID does 
not exist" Apr 16 18:06:44.465291 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.465238 2577 scope.go:117] "RemoveContainer" containerID="dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e" Apr 16 18:06:44.465515 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.465491 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e"} err="failed to get container status \"dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e\": rpc error: code = NotFound desc = could not find container \"dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e\": container with ID starting with dd1523381cedd880a61909935fff1abe9ada5c55a7503b9686313b636aad608e not found: ID does not exist" Apr 16 18:06:44.465579 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.465516 2577 scope.go:117] "RemoveContainer" containerID="4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00" Apr 16 18:06:44.465713 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.465695 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00"} err="failed to get container status \"4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00\": rpc error: code = NotFound desc = could not find container \"4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00\": container with ID starting with 4ae8dafbb186770339f5aabdb0b9e389520da95bd28de59ec217f99638dd7f00 not found: ID does not exist" Apr 16 18:06:44.465754 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.465714 2577 scope.go:117] "RemoveContainer" containerID="018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4" Apr 16 18:06:44.465918 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.465903 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4"} err="failed to get container status \"018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4\": rpc error: code = NotFound desc = could not find container \"018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4\": container with ID starting with 018dc4daa7fcd0355c3cb3a058e2351e520ff9fde0b5ad635273bdcc714d68f4 not found: ID does not exist" Apr 16 18:06:44.477418 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.477383 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" path="/var/lib/kubelet/pods/9c80e072-3897-4516-ba1c-74201dec58e9/volumes" Apr 16 18:06:44.477897 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.477883 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:06:44.478176 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478164 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="init-config-reloader" Apr 16 18:06:44.478214 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478178 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="init-config-reloader" Apr 16 18:06:44.478214 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478187 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="prom-label-proxy" Apr 16 18:06:44.478214 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478192 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="prom-label-proxy" Apr 16 18:06:44.478214 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478202 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="config-reloader" Apr 16 18:06:44.478214 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478208 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="config-reloader" Apr 16 18:06:44.478214 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478215 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="kube-rbac-proxy-metric" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478221 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="kube-rbac-proxy-metric" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478229 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="kube-rbac-proxy-web" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478233 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="kube-rbac-proxy-web" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478239 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="kube-rbac-proxy" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478244 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="kube-rbac-proxy" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478252 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="alertmanager" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478257 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="alertmanager" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478304 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="kube-rbac-proxy" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478313 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="alertmanager" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478321 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="kube-rbac-proxy-web" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478328 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="config-reloader" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478335 2577 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="kube-rbac-proxy-metric" Apr 16 18:06:44.478410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.478343 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c80e072-3897-4516-ba1c-74201dec58e9" containerName="prom-label-proxy" Apr 16 18:06:44.483691 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.483674 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.485900 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.485876 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:06:44.485900 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.485899 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:06:44.486149 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.486129 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-n8zmp\"" Apr 16 18:06:44.486229 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.486186 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:06:44.486282 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.486190 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:06:44.486282 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.486237 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:06:44.486557 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.486539 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:06:44.486632 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.486538 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:06:44.487028 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.487015 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:06:44.492750 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.492729 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:06:44.495900 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.495852 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.544724 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-web-config\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.544784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/16a93748-4af4-4427-b39e-657b0f9ac96b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.544811 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16a93748-4af4-4427-b39e-657b0f9ac96b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.544844 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djxcg\" (UniqueName: \"kubernetes.io/projected/16a93748-4af4-4427-b39e-657b0f9ac96b-kube-api-access-djxcg\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.544874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-config-volume\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.544915 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/16a93748-4af4-4427-b39e-657b0f9ac96b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.544956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.544982 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16a93748-4af4-4427-b39e-657b0f9ac96b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.545073 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.545123 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.545178 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.545209 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.548566 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.545244 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16a93748-4af4-4427-b39e-657b0f9ac96b-config-out\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646307 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646469 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646469 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646469 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646440 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16a93748-4af4-4427-b39e-657b0f9ac96b-config-out\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646616 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-web-config\") pod \"alertmanager-main-0\" 
(UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646616 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/16a93748-4af4-4427-b39e-657b0f9ac96b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646719 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16a93748-4af4-4427-b39e-657b0f9ac96b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646719 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646654 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djxcg\" (UniqueName: \"kubernetes.io/projected/16a93748-4af4-4427-b39e-657b0f9ac96b-kube-api-access-djxcg\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646719 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646686 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-config-volume\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646859 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646724 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/16a93748-4af4-4427-b39e-657b0f9ac96b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646859 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646859 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16a93748-4af4-4427-b39e-657b0f9ac96b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.646996 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.646877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.648219 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.648175 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16a93748-4af4-4427-b39e-657b0f9ac96b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.648344 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.648312 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/16a93748-4af4-4427-b39e-657b0f9ac96b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.648841 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.648815 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16a93748-4af4-4427-b39e-657b0f9ac96b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.649449 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.649422 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/16a93748-4af4-4427-b39e-657b0f9ac96b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.649544 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.649516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.649624 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.649551 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.649782 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.649751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.650353 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.650329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-web-config\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.650586 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.650566 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16a93748-4af4-4427-b39e-657b0f9ac96b-config-out\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.650586 ip-10-0-134-133 kubenswrapper[2577]: 
I0416 18:06:44.650576 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.650906 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.650888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-config-volume\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.650991 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.650975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16a93748-4af4-4427-b39e-657b0f9ac96b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.655947 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.655929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxcg\" (UniqueName: \"kubernetes.io/projected/16a93748-4af4-4427-b39e-657b0f9ac96b-kube-api-access-djxcg\") pod \"alertmanager-main-0\" (UID: \"16a93748-4af4-4427-b39e-657b0f9ac96b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.794168 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.794088 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:06:44.922734 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:06:44.922692 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-77v8t" podUID="6429bf79-1554-458a-8ed2-de631c73ca89" Apr 16 18:06:44.922886 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:06:44.922692 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8rjlz" podUID="176aef22-2713-42bf-81d6-9602a79bf10f" Apr 16 18:06:44.934440 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:44.934265 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:06:44.937096 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:06:44.937070 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16a93748_4af4_4427_b39e_657b0f9ac96b.slice/crio-ee137bacb0765ea1fc4c35c005b204c4c4f39c6e80f8aa7f2b1fd95f5c45fb71 WatchSource:0}: Error finding container ee137bacb0765ea1fc4c35c005b204c4c4f39c6e80f8aa7f2b1fd95f5c45fb71: Status 404 returned error can't find the container with id ee137bacb0765ea1fc4c35c005b204c4c4f39c6e80f8aa7f2b1fd95f5c45fb71 Apr 16 18:06:45.418638 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:45.418599 2577 generic.go:358] "Generic (PLEG): container finished" podID="16a93748-4af4-4427-b39e-657b0f9ac96b" containerID="a9787e4df614fa2522b31df2bb4bb17c2bd122f27d010f411c13489a7fb9a9a1" exitCode=0 Apr 16 18:06:45.419014 
ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:45.418683 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16a93748-4af4-4427-b39e-657b0f9ac96b","Type":"ContainerDied","Data":"a9787e4df614fa2522b31df2bb4bb17c2bd122f27d010f411c13489a7fb9a9a1"} Apr 16 18:06:45.419014 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:45.418718 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16a93748-4af4-4427-b39e-657b0f9ac96b","Type":"ContainerStarted","Data":"ee137bacb0765ea1fc4c35c005b204c4c4f39c6e80f8aa7f2b1fd95f5c45fb71"} Apr 16 18:06:45.419014 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:45.418740 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-77v8t" Apr 16 18:06:45.419014 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:45.418764 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:06:46.425517 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:46.425480 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16a93748-4af4-4427-b39e-657b0f9ac96b","Type":"ContainerStarted","Data":"142eb1dd1ebed295630024dc4fab8d8704ebe7cafc76d32aec2bcafb3b11851f"} Apr 16 18:06:46.425517 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:46.425518 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16a93748-4af4-4427-b39e-657b0f9ac96b","Type":"ContainerStarted","Data":"ff773cf3c250053b2e238803b0f0aeee6c0f47e35b916d53e74ef2aa4d388cb3"} Apr 16 18:06:46.426015 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:46.425530 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16a93748-4af4-4427-b39e-657b0f9ac96b","Type":"ContainerStarted","Data":"3d31aaef154596f89f6d9c475b040a655c4228ab40b88b22d23b2c28067db91d"} Apr 16 18:06:46.426015 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:46.425543 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16a93748-4af4-4427-b39e-657b0f9ac96b","Type":"ContainerStarted","Data":"a47821866403cb3113176ce7587452fd8c86c33dd485ad5f32da70f5d5b20c2f"} Apr 16 18:06:46.426015 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:46.425566 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16a93748-4af4-4427-b39e-657b0f9ac96b","Type":"ContainerStarted","Data":"37d9c0b755ff1b6a00733f040577f0791f573a03ddc5469384aa14c4daf580d9"} Apr 16 18:06:46.426015 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:46.425577 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16a93748-4af4-4427-b39e-657b0f9ac96b","Type":"ContainerStarted","Data":"ff2ba9cdb13ea074a36e57404d94fb9855d1c78f3b9bc9f6c82b37aa4a9148bc"} Apr 16 18:06:46.453411 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:46.453345 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.453320067 podStartE2EDuration="2.453320067s" podCreationTimestamp="2026-04-16 18:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:06:46.451322403 +0000 UTC 
m=+280.564129857" watchObservedRunningTime="2026-04-16 18:06:46.453320067 +0000 UTC m=+280.566127521" Apr 16 18:06:48.888202 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:48.888161 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:06:48.888708 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:48.888224 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:06:48.890616 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:48.890590 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6429bf79-1554-458a-8ed2-de631c73ca89-metrics-tls\") pod \"dns-default-77v8t\" (UID: \"6429bf79-1554-458a-8ed2-de631c73ca89\") " pod="openshift-dns/dns-default-77v8t" Apr 16 18:06:48.890688 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:48.890648 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/176aef22-2713-42bf-81d6-9602a79bf10f-cert\") pod \"ingress-canary-8rjlz\" (UID: \"176aef22-2713-42bf-81d6-9602a79bf10f\") " pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:06:49.021303 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:49.021266 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dxq2x\"" Apr 16 18:06:49.021704 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:49.021685 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-j5p9z\"" Apr 16 18:06:49.030059 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:49.030031 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8rjlz" Apr 16 18:06:49.030184 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:49.030127 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-77v8t" Apr 16 18:06:49.157361 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:49.157284 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-77v8t"] Apr 16 18:06:49.161765 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:06:49.161733 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6429bf79_1554_458a_8ed2_de631c73ca89.slice/crio-07435d8efd784872e3ae0e8ce3a27daa397d4199a5004822007ec31d9578a300 WatchSource:0}: Error finding container 07435d8efd784872e3ae0e8ce3a27daa397d4199a5004822007ec31d9578a300: Status 404 returned error can't find the container with id 07435d8efd784872e3ae0e8ce3a27daa397d4199a5004822007ec31d9578a300 Apr 16 18:06:49.184113 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:49.184088 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8rjlz"] Apr 16 18:06:49.186417 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:06:49.186359 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod176aef22_2713_42bf_81d6_9602a79bf10f.slice/crio-708f4f4f6e931201b816063a1346de9a3a52a275f8bd39244c4dd837fbf4ab1c WatchSource:0}: Error finding container 708f4f4f6e931201b816063a1346de9a3a52a275f8bd39244c4dd837fbf4ab1c: Status 404 returned error can't find the container with id 708f4f4f6e931201b816063a1346de9a3a52a275f8bd39244c4dd837fbf4ab1c Apr 16 18:06:49.436553 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:49.436455 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8rjlz" event={"ID":"176aef22-2713-42bf-81d6-9602a79bf10f","Type":"ContainerStarted","Data":"708f4f4f6e931201b816063a1346de9a3a52a275f8bd39244c4dd837fbf4ab1c"} Apr 16 18:06:49.437571 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:49.437545 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-77v8t" event={"ID":"6429bf79-1554-458a-8ed2-de631c73ca89","Type":"ContainerStarted","Data":"07435d8efd784872e3ae0e8ce3a27daa397d4199a5004822007ec31d9578a300"} Apr 16 18:06:51.446980 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:51.446880 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8rjlz" event={"ID":"176aef22-2713-42bf-81d6-9602a79bf10f","Type":"ContainerStarted","Data":"3fbe71763c2f72b5b8f0b31c22c9b8c19e97878823483729cec98b334d0effb9"} Apr 16 18:06:51.448520 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:51.448496 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-77v8t" event={"ID":"6429bf79-1554-458a-8ed2-de631c73ca89","Type":"ContainerStarted","Data":"757255b297d41c428fb47dbb6865f0511e65ebd949a899d7e3080daaa5dad210"} Apr 16 18:06:51.448633 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:51.448524 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-77v8t" event={"ID":"6429bf79-1554-458a-8ed2-de631c73ca89","Type":"ContainerStarted","Data":"14c5e64b1e6139ce760f5647d13ee9ca8285276b63eaa18c92412cf557a5f470"} Apr 16 18:06:51.448633 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:51.448546 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-77v8t" Apr 16 18:06:51.469387 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:51.465846 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress-canary/ingress-canary-8rjlz" podStartSLOduration=251.739964137 podStartE2EDuration="4m13.465829655s" podCreationTimestamp="2026-04-16 18:02:38 +0000 UTC" firstStartedPulling="2026-04-16 18:06:49.188206301 +0000 UTC m=+283.301013736" lastFinishedPulling="2026-04-16 18:06:50.914071822 +0000 UTC m=+285.026879254" observedRunningTime="2026-04-16 18:06:51.464259759 +0000 UTC m=+285.577067212" watchObservedRunningTime="2026-04-16 18:06:51.465829655 +0000 UTC m=+285.578637109" Apr 16 18:06:51.484770 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:51.484727 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:51.484915 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:51.484789 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:51.490612 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:51.490586 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:51.501327 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:51.501270 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-77v8t" podStartSLOduration=251.754488813 podStartE2EDuration="4m13.501252166s" podCreationTimestamp="2026-04-16 18:02:38 +0000 UTC" firstStartedPulling="2026-04-16 18:06:49.163810037 +0000 UTC m=+283.276617470" lastFinishedPulling="2026-04-16 18:06:50.910573376 +0000 UTC m=+285.023380823" observedRunningTime="2026-04-16 18:06:51.500872905 +0000 UTC m=+285.613680365" watchObservedRunningTime="2026-04-16 18:06:51.501252166 +0000 UTC m=+285.614059622" Apr 16 18:06:52.455606 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:52.455574 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:06:52.512061 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:06:52.512022 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bccbff784-n5mgp"] Apr 16 18:07:01.454034 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:01.454002 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-77v8t" Apr 16 18:07:06.353462 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:06.353439 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:07:06.353796 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:06.353527 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:07:06.361721 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:06.361703 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:07:06.361820 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:06.361793 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:07:06.365161 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:06.365144 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:07:17.532949 
ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.532886 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-bccbff784-n5mgp" podUID="862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" containerName="console" containerID="cri-o://2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6" gracePeriod=15 Apr 16 18:07:17.768208 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.768186 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bccbff784-n5mgp_862e4c2a-87b6-48ef-bd3d-625c7b6ee95b/console/0.log" Apr 16 18:07:17.768336 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.768253 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:07:17.838013 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.837979 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-config\") pod \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " Apr 16 18:07:17.838204 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.838028 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-oauth-serving-cert\") pod \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " Apr 16 18:07:17.838204 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.838081 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-oauth-config\") pod \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " Apr 16 18:07:17.838204 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.838131 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-trusted-ca-bundle\") pod \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " Apr 16 18:07:17.838204 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.838154 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md7zv\" (UniqueName: \"kubernetes.io/projected/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-kube-api-access-md7zv\") pod \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " Apr 16 18:07:17.838423 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.838215 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-serving-cert\") pod \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " Apr 16 18:07:17.838423 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.838250 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-service-ca\") pod \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\" (UID: \"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b\") " Apr 16 18:07:17.838546 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.838517 2577 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-config" (OuterVolumeSpecName: "console-config") pod "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" (UID: "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:07:17.838637 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.838610 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" (UID: "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:07:17.838738 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.838710 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-service-ca" (OuterVolumeSpecName: "service-ca") pod "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" (UID: "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:07:17.838807 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.838796 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" (UID: "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:07:17.840429 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.840337 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" (UID: "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:07:17.840555 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.840431 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-kube-api-access-md7zv" (OuterVolumeSpecName: "kube-api-access-md7zv") pod "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" (UID: "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b"). InnerVolumeSpecName "kube-api-access-md7zv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:07:17.840555 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.840468 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" (UID: "862e4c2a-87b6-48ef-bd3d-625c7b6ee95b"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:07:17.939432 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.939389 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-oauth-config\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:07:17.939432 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.939423 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-trusted-ca-bundle\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:07:17.939432 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.939433 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-md7zv\" (UniqueName: \"kubernetes.io/projected/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-kube-api-access-md7zv\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:07:17.939432 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.939442 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-serving-cert\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:07:17.939701 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.939451 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-service-ca\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:07:17.939701 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.939460 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-console-config\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:07:17.939701 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:17.939468 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b-oauth-serving-cert\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:07:18.536092 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:18.536064 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bccbff784-n5mgp_862e4c2a-87b6-48ef-bd3d-625c7b6ee95b/console/0.log" Apr 16 18:07:18.536528 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:18.536109 2577 generic.go:358] "Generic (PLEG): container finished" podID="862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" containerID="2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6" exitCode=2 Apr 16 18:07:18.536528 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:18.536201 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bccbff784-n5mgp" Apr 16 18:07:18.536528 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:18.536198 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bccbff784-n5mgp" event={"ID":"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b","Type":"ContainerDied","Data":"2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6"} Apr 16 18:07:18.536528 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:18.536313 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bccbff784-n5mgp" event={"ID":"862e4c2a-87b6-48ef-bd3d-625c7b6ee95b","Type":"ContainerDied","Data":"321175c0a061042e17cb978328c30b5a8c649fdab52e1441429cf338dc13102f"} Apr 16 18:07:18.536528 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:18.536330 2577 scope.go:117] "RemoveContainer" containerID="2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6" Apr 16 18:07:18.545034 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:18.545015 2577 scope.go:117] "RemoveContainer" containerID="2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6" Apr 16 18:07:18.545325 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:07:18.545304 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6\": container with ID starting with 2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6 not found: ID does not exist" containerID="2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6" Apr 16 18:07:18.545382 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:18.545335 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6"} err="failed to get container status \"2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6\": rpc error: code = NotFound desc = could not find container \"2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6\": container with ID starting with 2a01813ac54fca5ad202b1503584c6ac1f564d85af8111e0671c5d2146415cb6 not found: ID does not exist" Apr 16 18:07:18.552059 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:18.552026 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bccbff784-n5mgp"] Apr 16 18:07:18.556929 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:18.556898 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-bccbff784-n5mgp"] Apr 16 18:07:20.477518 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:07:20.477475 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" path="/var/lib/kubelet/pods/862e4c2a-87b6-48ef-bd3d-625c7b6ee95b/volumes" Apr 16 18:10:47.810838 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.810801 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55c796ccb9-hvjzr"] Apr 16 18:10:47.811362 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.811180 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" containerName="console" Apr 16 18:10:47.811362 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.811195 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" containerName="console" Apr 16 18:10:47.811362 ip-10-0-134-133 kubenswrapper[2577]: 
I0416 18:10:47.811260 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="862e4c2a-87b6-48ef-bd3d-625c7b6ee95b" containerName="console" Apr 16 18:10:47.814421 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.814397 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.835769 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.835740 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55c796ccb9-hvjzr"] Apr 16 18:10:47.880246 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.880209 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8d65037-b15a-4ca6-ab9c-0d31940528fe-console-config\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.880246 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.880245 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8d65037-b15a-4ca6-ab9c-0d31940528fe-service-ca\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.880483 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.880266 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8d65037-b15a-4ca6-ab9c-0d31940528fe-oauth-serving-cert\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.880483 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.880317 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d65037-b15a-4ca6-ab9c-0d31940528fe-console-serving-cert\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.880483 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.880407 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8d65037-b15a-4ca6-ab9c-0d31940528fe-trusted-ca-bundle\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.880483 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.880445 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8d65037-b15a-4ca6-ab9c-0d31940528fe-console-oauth-config\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.880631 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.880508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brqlr\" (UniqueName: \"kubernetes.io/projected/f8d65037-b15a-4ca6-ab9c-0d31940528fe-kube-api-access-brqlr\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " 
pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.981502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.981459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8d65037-b15a-4ca6-ab9c-0d31940528fe-console-oauth-config\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.981502 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.981510 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brqlr\" (UniqueName: \"kubernetes.io/projected/f8d65037-b15a-4ca6-ab9c-0d31940528fe-kube-api-access-brqlr\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.981762 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.981705 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8d65037-b15a-4ca6-ab9c-0d31940528fe-console-config\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.981762 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.981755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8d65037-b15a-4ca6-ab9c-0d31940528fe-service-ca\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.981869 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.981784 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8d65037-b15a-4ca6-ab9c-0d31940528fe-oauth-serving-cert\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.981869 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.981814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d65037-b15a-4ca6-ab9c-0d31940528fe-console-serving-cert\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.981966 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.981868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8d65037-b15a-4ca6-ab9c-0d31940528fe-trusted-ca-bundle\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.982578 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.982554 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8d65037-b15a-4ca6-ab9c-0d31940528fe-oauth-serving-cert\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.982683 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.982607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/f8d65037-b15a-4ca6-ab9c-0d31940528fe-console-config\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.982683 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.982610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8d65037-b15a-4ca6-ab9c-0d31940528fe-service-ca\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.982683 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.982643 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8d65037-b15a-4ca6-ab9c-0d31940528fe-trusted-ca-bundle\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.984068 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.984045 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8d65037-b15a-4ca6-ab9c-0d31940528fe-console-oauth-config\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.984191 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.984174 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d65037-b15a-4ca6-ab9c-0d31940528fe-console-serving-cert\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:47.989445 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:47.989419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brqlr\" (UniqueName: \"kubernetes.io/projected/f8d65037-b15a-4ca6-ab9c-0d31940528fe-kube-api-access-brqlr\") pod \"console-55c796ccb9-hvjzr\" (UID: \"f8d65037-b15a-4ca6-ab9c-0d31940528fe\") " pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:48.123579 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:48.123541 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:48.255667 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:48.255641 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55c796ccb9-hvjzr"] Apr 16 18:10:48.258400 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:10:48.258351 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8d65037_b15a_4ca6_ab9c_0d31940528fe.slice/crio-d63f481e4eb33e4437b2a37777882e1e4e2ab6971157496983f46be2d68a9b11 WatchSource:0}: Error finding container d63f481e4eb33e4437b2a37777882e1e4e2ab6971157496983f46be2d68a9b11: Status 404 returned error can't find the container with id d63f481e4eb33e4437b2a37777882e1e4e2ab6971157496983f46be2d68a9b11 Apr 16 18:10:48.260402 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:48.260357 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:10:49.170430 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:49.170392 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c796ccb9-hvjzr" event={"ID":"f8d65037-b15a-4ca6-ab9c-0d31940528fe","Type":"ContainerStarted","Data":"d96c87ebd8fe2aec61552b5dcb5bd95e72e189789f7d5aa7e7b1ebca526098d8"} Apr 16 18:10:49.170430 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:49.170432 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55c796ccb9-hvjzr" event={"ID":"f8d65037-b15a-4ca6-ab9c-0d31940528fe","Type":"ContainerStarted","Data":"d63f481e4eb33e4437b2a37777882e1e4e2ab6971157496983f46be2d68a9b11"} Apr 16 18:10:49.190021 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:49.189970 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55c796ccb9-hvjzr" podStartSLOduration=2.189954338 podStartE2EDuration="2.189954338s" podCreationTimestamp="2026-04-16 18:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:10:49.18888103 +0000 UTC m=+523.301688496" watchObservedRunningTime="2026-04-16 18:10:49.189954338 +0000 UTC m=+523.302761794" Apr 16 18:10:58.124539 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:58.124445 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:58.124539 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:58.124503 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:58.128938 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:58.128914 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:58.202264 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:58.202239 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55c796ccb9-hvjzr" Apr 16 18:10:58.265873 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:10:58.265843 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cd57f7657-xs4zm"] Apr 16 18:11:23.285743 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.285682 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7cd57f7657-xs4zm" podUID="acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" 
containerName="console" containerID="cri-o://a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8" gracePeriod=15 Apr 16 18:11:23.533716 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.533692 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cd57f7657-xs4zm_acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41/console/0.log" Apr 16 18:11:23.533852 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.533760 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:11:23.665294 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.665252 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-oauth-serving-cert\") pod \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " Apr 16 18:11:23.665504 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.665307 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-oauth-config\") pod \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " Apr 16 18:11:23.665504 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.665334 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22d67\" (UniqueName: \"kubernetes.io/projected/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-kube-api-access-22d67\") pod \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " Apr 16 18:11:23.665504 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.665473 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-service-ca\") pod \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " Apr 16 18:11:23.665673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.665539 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-serving-cert\") pod \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " Apr 16 18:11:23.665673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.665579 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-config\") pod \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " Apr 16 18:11:23.665673 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.665631 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-trusted-ca-bundle\") pod \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\" (UID: \"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41\") " Apr 16 18:11:23.665940 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.665869 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-service-ca" (OuterVolumeSpecName: "service-ca") pod "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" (UID: 
"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:11:23.666013 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.665972 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-config" (OuterVolumeSpecName: "console-config") pod "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" (UID: "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:11:23.666013 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.665985 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" (UID: "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:11:23.666112 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.666080 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" (UID: "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:11:23.667584 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.667551 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-kube-api-access-22d67" (OuterVolumeSpecName: "kube-api-access-22d67") pod "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" (UID: "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41"). InnerVolumeSpecName "kube-api-access-22d67". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:11:23.667950 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.667920 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" (UID: "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:11:23.667950 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.667931 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" (UID: "acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:11:23.766301 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.766263 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-service-ca\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:11:23.766301 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.766295 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-serving-cert\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:11:23.766301 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.766307 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-config\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:11:23.766562 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.766316 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-trusted-ca-bundle\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:11:23.766562 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.766325 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-oauth-serving-cert\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:11:23.766562 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.766334 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-console-oauth-config\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:11:23.766562 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:23.766342 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-22d67\" (UniqueName: \"kubernetes.io/projected/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41-kube-api-access-22d67\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:11:24.294290 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:24.294263 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cd57f7657-xs4zm_acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41/console/0.log" Apr 16 18:11:24.294726 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:24.294302 2577 generic.go:358] "Generic (PLEG): container finished" podID="acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" containerID="a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8" exitCode=2 Apr 16 18:11:24.294726 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:24.294393 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd57f7657-xs4zm" event={"ID":"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41","Type":"ContainerDied","Data":"a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8"} Apr 16 18:11:24.294726 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:24.294404 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cd57f7657-xs4zm" Apr 16 18:11:24.294726 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:24.294440 2577 scope.go:117] "RemoveContainer" containerID="a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8" Apr 16 18:11:24.294726 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:24.294431 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd57f7657-xs4zm" event={"ID":"acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41","Type":"ContainerDied","Data":"c902e4edb4fca44c71e11fff78f636125bf83cd78c55b1617bc75ddf0884544e"} Apr 16 18:11:24.302882 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:24.302863 2577 scope.go:117] "RemoveContainer" containerID="a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8" Apr 16 18:11:24.303141 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:11:24.303123 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8\": container with ID starting with a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8 not found: ID does not exist" containerID="a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8" Apr 16 18:11:24.303203 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:24.303149 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8"} err="failed to get container status \"a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8\": rpc error: code = NotFound desc = could not find container \"a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8\": container with ID starting with a5dc628a3003ed61fae62e745492ed23d9c6b8f31f5c7e4d88287418a115aea8 not found: ID does not exist" Apr 16 18:11:24.316387 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:24.316348 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7cd57f7657-xs4zm"] Apr 16 18:11:24.319795 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:24.319774 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7cd57f7657-xs4zm"] Apr 16 18:11:24.477106 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:11:24.477075 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" path="/var/lib/kubelet/pods/acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41/volumes" Apr 16 18:12:04.264143 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.264106 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs"] Apr 16 18:12:04.264589 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.264461 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" containerName="console" Apr 16 18:12:04.264589 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.264474 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" containerName="console" Apr 16 18:12:04.264589 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.264551 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="acb1fd9e-ee4e-41cd-bd7d-bfc774dacb41" containerName="console" Apr 16 18:12:04.267635 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.267616 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:04.270574 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.270552 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 18:12:04.271058 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.271039 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wcgmw\"" Apr 16 18:12:04.271140 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.271043 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 18:12:04.279282 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.279255 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs"] Apr 16 18:12:04.306679 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.306645 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs\" (UID: \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:04.306866 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.306714 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzn2p\" (UniqueName: \"kubernetes.io/projected/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-kube-api-access-nzn2p\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs\" (UID: \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:04.306866 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.306759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs\" (UID: \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:04.408183 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.408141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzn2p\" (UniqueName: \"kubernetes.io/projected/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-kube-api-access-nzn2p\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs\" (UID: \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:04.408183 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.408181 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs\" (UID: \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:04.408463 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.408225 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs\" (UID: \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:04.408626 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.408605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs\" (UID: \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:04.408697 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.408651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs\" (UID: \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:04.418057 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.418031 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzn2p\" (UniqueName: \"kubernetes.io/projected/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-kube-api-access-nzn2p\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs\" (UID: \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:04.576377 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.576346 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:04.709657 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:04.709625 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs"] Apr 16 18:12:04.715838 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:12:04.715789 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2e1a455_a21f_4e77_ab56_eb7b94400f7f.slice/crio-bd10c4aa0b63f7d5174f34bbd50c6abb96d9f73a34e72b1d3e39b624522e9e5f WatchSource:0}: Error finding container bd10c4aa0b63f7d5174f34bbd50c6abb96d9f73a34e72b1d3e39b624522e9e5f: Status 404 returned error can't find the container with id bd10c4aa0b63f7d5174f34bbd50c6abb96d9f73a34e72b1d3e39b624522e9e5f Apr 16 18:12:05.429323 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:05.429284 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" event={"ID":"b2e1a455-a21f-4e77-ab56-eb7b94400f7f","Type":"ContainerStarted","Data":"bd10c4aa0b63f7d5174f34bbd50c6abb96d9f73a34e72b1d3e39b624522e9e5f"} Apr 16 18:12:06.380158 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:06.380119 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:12:06.383274 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:06.383248 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:12:06.389789 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:06.389763 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:12:06.392878 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:06.392856 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:12:10.450310 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:10.450271 2577 generic.go:358] "Generic (PLEG): container finished" podID="b2e1a455-a21f-4e77-ab56-eb7b94400f7f" containerID="f1b6b9324535e02640afe16cc40b40ed109757318d3611d1721d691dd281dc18" exitCode=0 Apr 16 18:12:10.450725 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:10.450362 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" event={"ID":"b2e1a455-a21f-4e77-ab56-eb7b94400f7f","Type":"ContainerDied","Data":"f1b6b9324535e02640afe16cc40b40ed109757318d3611d1721d691dd281dc18"} Apr 16 18:12:13.468450 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:13.468411 2577 generic.go:358] "Generic (PLEG): container finished" podID="b2e1a455-a21f-4e77-ab56-eb7b94400f7f" containerID="15ce7c6fefef15252b642b4ab9c809275da65366aef07f2a744662b69ca5dd07" exitCode=0 Apr 16 18:12:13.468923 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:13.468495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" 
event={"ID":"b2e1a455-a21f-4e77-ab56-eb7b94400f7f","Type":"ContainerDied","Data":"15ce7c6fefef15252b642b4ab9c809275da65366aef07f2a744662b69ca5dd07"} Apr 16 18:12:19.496026 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:19.495917 2577 generic.go:358] "Generic (PLEG): container finished" podID="b2e1a455-a21f-4e77-ab56-eb7b94400f7f" containerID="30d0808d55a2b8aa15ac7609afd0910b4d952ba9058a26a54f643ade30e0e898" exitCode=0 Apr 16 18:12:19.496026 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:19.496005 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" event={"ID":"b2e1a455-a21f-4e77-ab56-eb7b94400f7f","Type":"ContainerDied","Data":"30d0808d55a2b8aa15ac7609afd0910b4d952ba9058a26a54f643ade30e0e898"} Apr 16 18:12:20.624138 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:20.624112 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:20.756909 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:20.756822 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-bundle\") pod \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\" (UID: \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\") " Apr 16 18:12:20.756909 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:20.756879 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzn2p\" (UniqueName: \"kubernetes.io/projected/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-kube-api-access-nzn2p\") pod \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\" (UID: \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\") " Apr 16 18:12:20.757111 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:20.756932 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-util\") pod \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\" (UID: \"b2e1a455-a21f-4e77-ab56-eb7b94400f7f\") " Apr 16 18:12:20.757543 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:20.757510 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-bundle" (OuterVolumeSpecName: "bundle") pod "b2e1a455-a21f-4e77-ab56-eb7b94400f7f" (UID: "b2e1a455-a21f-4e77-ab56-eb7b94400f7f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:20.759227 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:20.759208 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-kube-api-access-nzn2p" (OuterVolumeSpecName: "kube-api-access-nzn2p") pod "b2e1a455-a21f-4e77-ab56-eb7b94400f7f" (UID: "b2e1a455-a21f-4e77-ab56-eb7b94400f7f"). InnerVolumeSpecName "kube-api-access-nzn2p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:20.762013 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:20.761968 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-util" (OuterVolumeSpecName: "util") pod "b2e1a455-a21f-4e77-ab56-eb7b94400f7f" (UID: "b2e1a455-a21f-4e77-ab56-eb7b94400f7f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:20.858188 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:20.858146 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-bundle\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:12:20.858188 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:20.858181 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nzn2p\" (UniqueName: \"kubernetes.io/projected/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-kube-api-access-nzn2p\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:12:20.858188 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:20.858193 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2e1a455-a21f-4e77-ab56-eb7b94400f7f-util\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:12:21.504808 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:21.504771 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" event={"ID":"b2e1a455-a21f-4e77-ab56-eb7b94400f7f","Type":"ContainerDied","Data":"bd10c4aa0b63f7d5174f34bbd50c6abb96d9f73a34e72b1d3e39b624522e9e5f"} Apr 16 18:12:21.504808 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:21.504810 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd10c4aa0b63f7d5174f34bbd50c6abb96d9f73a34e72b1d3e39b624522e9e5f" Apr 16 18:12:21.505015 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:21.504787 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c4p2zs" Apr 16 18:12:26.695072 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.695039 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4"] Apr 16 18:12:26.695468 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.695395 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2e1a455-a21f-4e77-ab56-eb7b94400f7f" containerName="extract" Apr 16 18:12:26.695468 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.695410 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e1a455-a21f-4e77-ab56-eb7b94400f7f" containerName="extract" Apr 16 18:12:26.695468 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.695426 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2e1a455-a21f-4e77-ab56-eb7b94400f7f" containerName="pull" Apr 16 18:12:26.695468 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.695431 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e1a455-a21f-4e77-ab56-eb7b94400f7f" containerName="pull" Apr 16 18:12:26.695468 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.695442 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2e1a455-a21f-4e77-ab56-eb7b94400f7f" containerName="util" Apr 16 18:12:26.695468 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.695447 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2e1a455-a21f-4e77-ab56-eb7b94400f7f" containerName="util" Apr 16 18:12:26.695667 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.695497 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2e1a455-a21f-4e77-ab56-eb7b94400f7f" containerName="extract" Apr 
16 18:12:26.755795 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.755756 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4"] Apr 16 18:12:26.755966 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.755879 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" Apr 16 18:12:26.759923 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.759896 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 18:12:26.759923 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.759913 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-qhmn5\"" Apr 16 18:12:26.760106 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.759951 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 18:12:26.760787 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.760774 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 18:12:26.908651 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.908612 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d68a0da2-8d0e-4816-9d7f-0979e6972b7d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4\" (UID: \"d68a0da2-8d0e-4816-9d7f-0979e6972b7d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" Apr 16 18:12:26.908825 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:26.908755 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh72r\" (UniqueName: \"kubernetes.io/projected/d68a0da2-8d0e-4816-9d7f-0979e6972b7d-kube-api-access-xh72r\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4\" (UID: \"d68a0da2-8d0e-4816-9d7f-0979e6972b7d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" Apr 16 18:12:27.010096 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:27.010010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xh72r\" (UniqueName: \"kubernetes.io/projected/d68a0da2-8d0e-4816-9d7f-0979e6972b7d-kube-api-access-xh72r\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4\" (UID: \"d68a0da2-8d0e-4816-9d7f-0979e6972b7d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" Apr 16 18:12:27.010096 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:27.010051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d68a0da2-8d0e-4816-9d7f-0979e6972b7d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4\" (UID: \"d68a0da2-8d0e-4816-9d7f-0979e6972b7d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" Apr 16 18:12:27.012360 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:27.012339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d68a0da2-8d0e-4816-9d7f-0979e6972b7d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4\" (UID: \"d68a0da2-8d0e-4816-9d7f-0979e6972b7d\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" Apr 16 18:12:27.019943 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:27.019913 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh72r\" (UniqueName: \"kubernetes.io/projected/d68a0da2-8d0e-4816-9d7f-0979e6972b7d-kube-api-access-xh72r\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4\" (UID: \"d68a0da2-8d0e-4816-9d7f-0979e6972b7d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" Apr 16 18:12:27.066344 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:27.066317 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" Apr 16 18:12:27.202119 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:27.202083 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4"] Apr 16 18:12:27.205495 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:12:27.205464 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd68a0da2_8d0e_4816_9d7f_0979e6972b7d.slice/crio-bd4b1233946098d8f9b0edf23eb5f48e9c2e90d4a8c0906ec1f374cacca11b7f WatchSource:0}: Error finding container bd4b1233946098d8f9b0edf23eb5f48e9c2e90d4a8c0906ec1f374cacca11b7f: Status 404 returned error can't find the container with id bd4b1233946098d8f9b0edf23eb5f48e9c2e90d4a8c0906ec1f374cacca11b7f Apr 16 18:12:27.524899 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:27.524865 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" event={"ID":"d68a0da2-8d0e-4816-9d7f-0979e6972b7d","Type":"ContainerStarted","Data":"bd4b1233946098d8f9b0edf23eb5f48e9c2e90d4a8c0906ec1f374cacca11b7f"} Apr 16 18:12:31.541718 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.541684 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" event={"ID":"d68a0da2-8d0e-4816-9d7f-0979e6972b7d","Type":"ContainerStarted","Data":"bd710578f881ecd286e49e6a67f152130dbde5d8685dbc354599708960f54a59"} Apr 16 18:12:31.542225 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.541812 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" Apr 16 18:12:31.583531 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.583422 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" podStartSLOduration=1.954999457 podStartE2EDuration="5.583403862s" podCreationTimestamp="2026-04-16 18:12:26 +0000 UTC" firstStartedPulling="2026-04-16 18:12:27.207390765 +0000 UTC m=+621.320198196" lastFinishedPulling="2026-04-16 18:12:30.83579517 +0000 UTC m=+624.948602601" observedRunningTime="2026-04-16 18:12:31.582312568 +0000 UTC m=+625.695120023" watchObservedRunningTime="2026-04-16 18:12:31.583403862 +0000 UTC m=+625.696211316" Apr 16 18:12:31.619967 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.619932 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-wljt8"] Apr 16 18:12:31.643085 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.643045 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-wljt8"] Apr 16 18:12:31.643266 
ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.643158 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:31.645763 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.645725 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 18:12:31.645929 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.645785 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 18:12:31.646058 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.646042 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-rc7dn\"" Apr 16 18:12:31.750403 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.750360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/787fd2e9-c63f-4777-b6ce-bb0c8338707b-cabundle0\") pod \"keda-operator-ffbb595cb-wljt8\" (UID: \"787fd2e9-c63f-4777-b6ce-bb0c8338707b\") " pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:31.750584 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.750415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/787fd2e9-c63f-4777-b6ce-bb0c8338707b-certificates\") pod \"keda-operator-ffbb595cb-wljt8\" (UID: \"787fd2e9-c63f-4777-b6ce-bb0c8338707b\") " pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:31.750584 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.750548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qjrm\" (UniqueName: \"kubernetes.io/projected/787fd2e9-c63f-4777-b6ce-bb0c8338707b-kube-api-access-6qjrm\") pod \"keda-operator-ffbb595cb-wljt8\" (UID: \"787fd2e9-c63f-4777-b6ce-bb0c8338707b\") " pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:31.851836 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.851801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/787fd2e9-c63f-4777-b6ce-bb0c8338707b-certificates\") pod \"keda-operator-ffbb595cb-wljt8\" (UID: \"787fd2e9-c63f-4777-b6ce-bb0c8338707b\") " pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:31.852015 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.851908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qjrm\" (UniqueName: \"kubernetes.io/projected/787fd2e9-c63f-4777-b6ce-bb0c8338707b-kube-api-access-6qjrm\") pod \"keda-operator-ffbb595cb-wljt8\" (UID: \"787fd2e9-c63f-4777-b6ce-bb0c8338707b\") " pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:31.852015 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.851949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/787fd2e9-c63f-4777-b6ce-bb0c8338707b-cabundle0\") pod \"keda-operator-ffbb595cb-wljt8\" (UID: \"787fd2e9-c63f-4777-b6ce-bb0c8338707b\") " pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:31.852015 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:12:31.851981 2577 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not 
found Apr 16 18:12:31.852015 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:12:31.852001 2577 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:12:31.852015 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:12:31.852012 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:12:31.852203 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:12:31.852028 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-wljt8: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 18:12:31.852203 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:12:31.852099 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/787fd2e9-c63f-4777-b6ce-bb0c8338707b-certificates podName:787fd2e9-c63f-4777-b6ce-bb0c8338707b nodeName:}" failed. No retries permitted until 2026-04-16 18:12:32.352078644 +0000 UTC m=+626.464886090 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/787fd2e9-c63f-4777-b6ce-bb0c8338707b-certificates") pod "keda-operator-ffbb595cb-wljt8" (UID: "787fd2e9-c63f-4777-b6ce-bb0c8338707b") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 16 18:12:31.852641 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.852622 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/787fd2e9-c63f-4777-b6ce-bb0c8338707b-cabundle0\") pod \"keda-operator-ffbb595cb-wljt8\" (UID: \"787fd2e9-c63f-4777-b6ce-bb0c8338707b\") " pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:31.868664 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:31.868616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qjrm\" (UniqueName: \"kubernetes.io/projected/787fd2e9-c63f-4777-b6ce-bb0c8338707b-kube-api-access-6qjrm\") pod \"keda-operator-ffbb595cb-wljt8\" (UID: \"787fd2e9-c63f-4777-b6ce-bb0c8338707b\") " pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:32.007345 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.007307 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77"] Apr 16 18:12:32.028961 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.028926 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:32.030537 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.030510 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77"] Apr 16 18:12:32.031870 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.031853 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 18:12:32.155479 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.155396 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/82275fad-c8cb-4854-b317-5f3729f1b1d1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6xl77\" (UID: \"82275fad-c8cb-4854-b317-5f3729f1b1d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:32.155479 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.155441 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf592\" (UniqueName: \"kubernetes.io/projected/82275fad-c8cb-4854-b317-5f3729f1b1d1-kube-api-access-zf592\") pod \"keda-metrics-apiserver-7c9f485588-6xl77\" (UID: \"82275fad-c8cb-4854-b317-5f3729f1b1d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:32.155654 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.155515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/82275fad-c8cb-4854-b317-5f3729f1b1d1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6xl77\" (UID: \"82275fad-c8cb-4854-b317-5f3729f1b1d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:32.256343 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.256303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/82275fad-c8cb-4854-b317-5f3729f1b1d1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6xl77\" (UID: \"82275fad-c8cb-4854-b317-5f3729f1b1d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:32.256550 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.256350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf592\" (UniqueName: \"kubernetes.io/projected/82275fad-c8cb-4854-b317-5f3729f1b1d1-kube-api-access-zf592\") pod \"keda-metrics-apiserver-7c9f485588-6xl77\" (UID: \"82275fad-c8cb-4854-b317-5f3729f1b1d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:32.256550 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.256396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/82275fad-c8cb-4854-b317-5f3729f1b1d1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6xl77\" (UID: \"82275fad-c8cb-4854-b317-5f3729f1b1d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:32.256550 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:12:32.256480 2577 secret.go:281] references non-existent secret key: tls.crt Apr 16 18:12:32.256550 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:12:32.256502 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 18:12:32.256550 ip-10-0-134-133 
kubenswrapper[2577]: E0416 18:12:32.256525 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77: references non-existent secret key: tls.crt Apr 16 18:12:32.256847 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:12:32.256597 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82275fad-c8cb-4854-b317-5f3729f1b1d1-certificates podName:82275fad-c8cb-4854-b317-5f3729f1b1d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:12:32.756577011 +0000 UTC m=+626.869384443 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/82275fad-c8cb-4854-b317-5f3729f1b1d1-certificates") pod "keda-metrics-apiserver-7c9f485588-6xl77" (UID: "82275fad-c8cb-4854-b317-5f3729f1b1d1") : references non-existent secret key: tls.crt Apr 16 18:12:32.256984 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.256958 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/82275fad-c8cb-4854-b317-5f3729f1b1d1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-6xl77\" (UID: \"82275fad-c8cb-4854-b317-5f3729f1b1d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:32.265396 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.265349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf592\" (UniqueName: \"kubernetes.io/projected/82275fad-c8cb-4854-b317-5f3729f1b1d1-kube-api-access-zf592\") pod \"keda-metrics-apiserver-7c9f485588-6xl77\" (UID: \"82275fad-c8cb-4854-b317-5f3729f1b1d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:32.357335 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.357297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/787fd2e9-c63f-4777-b6ce-bb0c8338707b-certificates\") pod \"keda-operator-ffbb595cb-wljt8\" (UID: \"787fd2e9-c63f-4777-b6ce-bb0c8338707b\") " pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:32.357516 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:12:32.357426 2577 secret.go:281] references non-existent secret key: ca.crt Apr 16 18:12:32.357516 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:12:32.357440 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 18:12:32.357516 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:12:32.357449 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-wljt8: references non-existent secret key: ca.crt Apr 16 18:12:32.357516 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:12:32.357499 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/787fd2e9-c63f-4777-b6ce-bb0c8338707b-certificates podName:787fd2e9-c63f-4777-b6ce-bb0c8338707b nodeName:}" failed. No retries permitted until 2026-04-16 18:12:33.357484239 +0000 UTC m=+627.470291685 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/787fd2e9-c63f-4777-b6ce-bb0c8338707b-certificates") pod "keda-operator-ffbb595cb-wljt8" (UID: "787fd2e9-c63f-4777-b6ce-bb0c8338707b") : references non-existent secret key: ca.crt Apr 16 18:12:32.760410 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.760350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/82275fad-c8cb-4854-b317-5f3729f1b1d1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6xl77\" (UID: \"82275fad-c8cb-4854-b317-5f3729f1b1d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:32.762861 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.762836 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/82275fad-c8cb-4854-b317-5f3729f1b1d1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-6xl77\" (UID: \"82275fad-c8cb-4854-b317-5f3729f1b1d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:32.943676 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:32.943637 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:33.085307 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:33.085280 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77"] Apr 16 18:12:33.087401 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:12:33.087352 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82275fad_c8cb_4854_b317_5f3729f1b1d1.slice/crio-5070a22d999c6119369fa70ed8f699d701b06c7691323f6fa6de5564e92ef584 WatchSource:0}: Error finding container 5070a22d999c6119369fa70ed8f699d701b06c7691323f6fa6de5564e92ef584: Status 404 returned error can't find the container with id 5070a22d999c6119369fa70ed8f699d701b06c7691323f6fa6de5564e92ef584 Apr 16 18:12:33.365170 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:33.365131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/787fd2e9-c63f-4777-b6ce-bb0c8338707b-certificates\") pod \"keda-operator-ffbb595cb-wljt8\" (UID: \"787fd2e9-c63f-4777-b6ce-bb0c8338707b\") " pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:33.367648 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:33.367619 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/787fd2e9-c63f-4777-b6ce-bb0c8338707b-certificates\") pod \"keda-operator-ffbb595cb-wljt8\" (UID: \"787fd2e9-c63f-4777-b6ce-bb0c8338707b\") " pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:33.454850 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:33.454807 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:33.550355 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:33.550314 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" event={"ID":"82275fad-c8cb-4854-b317-5f3729f1b1d1","Type":"ContainerStarted","Data":"5070a22d999c6119369fa70ed8f699d701b06c7691323f6fa6de5564e92ef584"} Apr 16 18:12:33.583968 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:33.583933 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-wljt8"] Apr 16 18:12:33.586943 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:12:33.586913 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod787fd2e9_c63f_4777_b6ce_bb0c8338707b.slice/crio-865cab650f75ff9942b11d78fe68ae0035311e296330d24ba2d0ad4ad7c52074 WatchSource:0}: Error finding container 865cab650f75ff9942b11d78fe68ae0035311e296330d24ba2d0ad4ad7c52074: Status 404 returned error can't find the container with id 865cab650f75ff9942b11d78fe68ae0035311e296330d24ba2d0ad4ad7c52074 Apr 16 18:12:34.555544 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:34.555502 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-wljt8" event={"ID":"787fd2e9-c63f-4777-b6ce-bb0c8338707b","Type":"ContainerStarted","Data":"865cab650f75ff9942b11d78fe68ae0035311e296330d24ba2d0ad4ad7c52074"} Apr 16 18:12:37.574151 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:37.574114 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" event={"ID":"82275fad-c8cb-4854-b317-5f3729f1b1d1","Type":"ContainerStarted","Data":"e8ca30c310962691951b58537671ca674e3c3b12427b7d068ec2586994874528"} Apr 16 18:12:37.574604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:37.574239 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:37.575548 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:37.575525 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-wljt8" event={"ID":"787fd2e9-c63f-4777-b6ce-bb0c8338707b","Type":"ContainerStarted","Data":"d3bd81aa97a2c065912ac1216bf63e8b6ddb4ef803c336ee6df1889d628a91b3"} Apr 16 18:12:37.575618 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:37.575607 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:12:37.594001 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:37.593943 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" podStartSLOduration=2.9786049439999998 podStartE2EDuration="6.5939246s" podCreationTimestamp="2026-04-16 18:12:31 +0000 UTC" firstStartedPulling="2026-04-16 18:12:33.088589233 +0000 UTC m=+627.201396665" lastFinishedPulling="2026-04-16 18:12:36.703908888 +0000 UTC m=+630.816716321" observedRunningTime="2026-04-16 18:12:37.592148863 +0000 UTC m=+631.704956317" watchObservedRunningTime="2026-04-16 18:12:37.5939246 +0000 UTC m=+631.706732057" Apr 16 18:12:37.609723 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:37.609673 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-wljt8" podStartSLOduration=3.487924988 
podStartE2EDuration="6.60965709s" podCreationTimestamp="2026-04-16 18:12:31 +0000 UTC" firstStartedPulling="2026-04-16 18:12:33.588354422 +0000 UTC m=+627.701161854" lastFinishedPulling="2026-04-16 18:12:36.71008651 +0000 UTC m=+630.822893956" observedRunningTime="2026-04-16 18:12:37.607782674 +0000 UTC m=+631.720590129" watchObservedRunningTime="2026-04-16 18:12:37.60965709 +0000 UTC m=+631.722464544" Apr 16 18:12:48.582940 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:48.582912 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-6xl77" Apr 16 18:12:52.547508 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:52.547473 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-xpgd4" Apr 16 18:12:58.580552 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:12:58.580522 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-wljt8" Apr 16 18:13:44.572167 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:44.572128 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn"] Apr 16 18:13:44.575994 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:44.575973 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" Apr 16 18:13:44.578663 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:44.578642 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 18:13:44.578793 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:44.578707 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:13:44.579054 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:44.579035 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:13:44.579294 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:44.579281 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-hjgtw\"" Apr 16 18:13:44.595011 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:44.594980 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn"] Apr 16 18:13:44.683726 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:44.683678 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88b503cc-4043-4ece-9a71-22032bc4097f-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2xfjn\" (UID: \"88b503cc-4043-4ece-9a71-22032bc4097f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" Apr 16 18:13:44.683914 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:44.683798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlv5w\" (UniqueName: \"kubernetes.io/projected/88b503cc-4043-4ece-9a71-22032bc4097f-kube-api-access-tlv5w\") pod \"llmisvc-controller-manager-68cc5db7c4-2xfjn\" (UID: \"88b503cc-4043-4ece-9a71-22032bc4097f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" Apr 16 18:13:44.785140 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:44.785093 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/88b503cc-4043-4ece-9a71-22032bc4097f-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2xfjn\" (UID: \"88b503cc-4043-4ece-9a71-22032bc4097f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" Apr 16 18:13:44.785350 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:44.785178 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlv5w\" (UniqueName: \"kubernetes.io/projected/88b503cc-4043-4ece-9a71-22032bc4097f-kube-api-access-tlv5w\") pod \"llmisvc-controller-manager-68cc5db7c4-2xfjn\" (UID: \"88b503cc-4043-4ece-9a71-22032bc4097f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" Apr 16 18:13:44.785350 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:13:44.785269 2577 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 16 18:13:44.785505 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:13:44.785359 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88b503cc-4043-4ece-9a71-22032bc4097f-cert podName:88b503cc-4043-4ece-9a71-22032bc4097f nodeName:}" failed. No retries permitted until 2026-04-16 18:13:45.285338329 +0000 UTC m=+699.398145766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88b503cc-4043-4ece-9a71-22032bc4097f-cert") pod "llmisvc-controller-manager-68cc5db7c4-2xfjn" (UID: "88b503cc-4043-4ece-9a71-22032bc4097f") : secret "llmisvc-webhook-server-cert" not found Apr 16 18:13:44.795271 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:44.795245 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlv5w\" (UniqueName: \"kubernetes.io/projected/88b503cc-4043-4ece-9a71-22032bc4097f-kube-api-access-tlv5w\") pod \"llmisvc-controller-manager-68cc5db7c4-2xfjn\" (UID: \"88b503cc-4043-4ece-9a71-22032bc4097f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" Apr 16 18:13:45.290698 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:45.290660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88b503cc-4043-4ece-9a71-22032bc4097f-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2xfjn\" (UID: \"88b503cc-4043-4ece-9a71-22032bc4097f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" Apr 16 18:13:45.293020 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:45.292992 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88b503cc-4043-4ece-9a71-22032bc4097f-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-2xfjn\" (UID: \"88b503cc-4043-4ece-9a71-22032bc4097f\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" Apr 16 18:13:45.485892 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:45.485856 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" Apr 16 18:13:45.621175 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:45.621143 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn"] Apr 16 18:13:45.625299 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:13:45.625270 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod88b503cc_4043_4ece_9a71_22032bc4097f.slice/crio-2a1163934c5ebaabd2abead0542147181d209d35e4bff85c9541e8b852140559 WatchSource:0}: Error finding container 2a1163934c5ebaabd2abead0542147181d209d35e4bff85c9541e8b852140559: Status 404 returned error can't find the container with id 2a1163934c5ebaabd2abead0542147181d209d35e4bff85c9541e8b852140559 Apr 16 18:13:45.804314 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:45.804281 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" event={"ID":"88b503cc-4043-4ece-9a71-22032bc4097f","Type":"ContainerStarted","Data":"2a1163934c5ebaabd2abead0542147181d209d35e4bff85c9541e8b852140559"} Apr 16 18:13:47.813703 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:47.813614 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" event={"ID":"88b503cc-4043-4ece-9a71-22032bc4097f","Type":"ContainerStarted","Data":"16e6b1538b4703781a37571926380b243a349350ddf85a62142f1732b7e929ea"} Apr 16 18:13:47.814075 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:47.813745 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" Apr 16 18:13:47.829992 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:13:47.829942 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" podStartSLOduration=1.928819436 podStartE2EDuration="3.829926028s" podCreationTimestamp="2026-04-16 18:13:44 +0000 UTC" firstStartedPulling="2026-04-16 18:13:45.626532549 +0000 UTC m=+699.739339981" lastFinishedPulling="2026-04-16 18:13:47.527639127 +0000 UTC m=+701.640446573" observedRunningTime="2026-04-16 18:13:47.828343936 +0000 UTC m=+701.941151390" watchObservedRunningTime="2026-04-16 18:13:47.829926028 +0000 UTC m=+701.942733481" Apr 16 18:14:18.819090 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:18.819053 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-2xfjn" Apr 16 18:14:53.560334 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:53.560298 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-96vhg"] Apr 16 18:14:53.563518 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:53.563498 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-96vhg" Apr 16 18:14:53.565649 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:53.565626 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 18:14:53.565745 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:53.565661 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-vfl9r\"" Apr 16 18:14:53.573589 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:53.573561 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-96vhg"] Apr 16 18:14:53.684108 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:53.684066 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdfsc\" (UniqueName: \"kubernetes.io/projected/250ecb81-0337-4f25-8bb2-bfb98eb93768-kube-api-access-hdfsc\") pod \"model-serving-api-86f7b4b499-96vhg\" (UID: \"250ecb81-0337-4f25-8bb2-bfb98eb93768\") " pod="kserve/model-serving-api-86f7b4b499-96vhg" Apr 16 18:14:53.684266 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:53.684174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/250ecb81-0337-4f25-8bb2-bfb98eb93768-tls-certs\") pod \"model-serving-api-86f7b4b499-96vhg\" (UID: \"250ecb81-0337-4f25-8bb2-bfb98eb93768\") " pod="kserve/model-serving-api-86f7b4b499-96vhg" Apr 16 18:14:53.785359 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:53.785329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/250ecb81-0337-4f25-8bb2-bfb98eb93768-tls-certs\") pod \"model-serving-api-86f7b4b499-96vhg\" (UID: \"250ecb81-0337-4f25-8bb2-bfb98eb93768\") " pod="kserve/model-serving-api-86f7b4b499-96vhg" Apr 16 18:14:53.785538 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:53.785407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdfsc\" (UniqueName: \"kubernetes.io/projected/250ecb81-0337-4f25-8bb2-bfb98eb93768-kube-api-access-hdfsc\") pod \"model-serving-api-86f7b4b499-96vhg\" (UID: \"250ecb81-0337-4f25-8bb2-bfb98eb93768\") " pod="kserve/model-serving-api-86f7b4b499-96vhg" Apr 16 18:14:53.787661 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:53.787635 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/250ecb81-0337-4f25-8bb2-bfb98eb93768-tls-certs\") pod \"model-serving-api-86f7b4b499-96vhg\" (UID: \"250ecb81-0337-4f25-8bb2-bfb98eb93768\") " pod="kserve/model-serving-api-86f7b4b499-96vhg" Apr 16 18:14:53.793150 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:53.793126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdfsc\" (UniqueName: \"kubernetes.io/projected/250ecb81-0337-4f25-8bb2-bfb98eb93768-kube-api-access-hdfsc\") pod \"model-serving-api-86f7b4b499-96vhg\" (UID: \"250ecb81-0337-4f25-8bb2-bfb98eb93768\") " pod="kserve/model-serving-api-86f7b4b499-96vhg" Apr 16 18:14:53.874571 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:53.874547 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-96vhg" Apr 16 18:14:54.002932 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:54.002909 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-96vhg"] Apr 16 18:14:54.005941 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:14:54.005915 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250ecb81_0337_4f25_8bb2_bfb98eb93768.slice/crio-d616160a2e8b1b5270931f9ff72e2141f5d0c060136cb2c3160147d9b8c0c161 WatchSource:0}: Error finding container d616160a2e8b1b5270931f9ff72e2141f5d0c060136cb2c3160147d9b8c0c161: Status 404 returned error can't find the container with id d616160a2e8b1b5270931f9ff72e2141f5d0c060136cb2c3160147d9b8c0c161 Apr 16 18:14:54.049871 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:54.049842 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-96vhg" event={"ID":"250ecb81-0337-4f25-8bb2-bfb98eb93768","Type":"ContainerStarted","Data":"d616160a2e8b1b5270931f9ff72e2141f5d0c060136cb2c3160147d9b8c0c161"} Apr 16 18:14:57.062987 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:57.062951 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-96vhg" event={"ID":"250ecb81-0337-4f25-8bb2-bfb98eb93768","Type":"ContainerStarted","Data":"e60b84f4d7cc2182ba4b2e391c85c70937a509ed1eaa2eebfa4dd1132b8c3918"} Apr 16 18:14:57.063402 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:57.063071 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-96vhg" Apr 16 18:14:57.085409 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:14:57.085342 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-96vhg" podStartSLOduration=1.8893129910000002 podStartE2EDuration="4.085327825s" podCreationTimestamp="2026-04-16 18:14:53 +0000 UTC" firstStartedPulling="2026-04-16 18:14:54.007883413 +0000 UTC m=+768.120690848" lastFinishedPulling="2026-04-16 18:14:56.203898235 +0000 UTC m=+770.316705682" observedRunningTime="2026-04-16 18:14:57.083267378 +0000 UTC m=+771.196074833" watchObservedRunningTime="2026-04-16 18:14:57.085327825 +0000 UTC m=+771.198135278" Apr 16 18:15:08.069770 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:08.069739 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-96vhg" Apr 16 18:15:30.767749 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:30.767642 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4"] Apr 16 18:15:30.772351 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:30.772326 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" Apr 16 18:15:30.775515 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:30.775494 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-zqf8f\"" Apr 16 18:15:30.781872 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:30.781853 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" Apr 16 18:15:30.955936 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:30.955903 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4"] Apr 16 18:15:31.146090 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.146060 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4"] Apr 16 18:15:31.146960 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:15:31.146929 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db464a8_101f_4abb_84dc_c2dd96792242.slice/crio-1a72083cd660a16b28aa48abba8796a3061ffe79e50ee360e12d1ea04d49598d WatchSource:0}: Error finding container 1a72083cd660a16b28aa48abba8796a3061ffe79e50ee360e12d1ea04d49598d: Status 404 returned error can't find the container with id 1a72083cd660a16b28aa48abba8796a3061ffe79e50ee360e12d1ea04d49598d Apr 16 18:15:31.147842 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.147823 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps"] Apr 16 18:15:31.152941 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.152919 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" Apr 16 18:15:31.179216 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.179185 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps"] Apr 16 18:15:31.183437 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.183404 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" event={"ID":"3db464a8-101f-4abb-84dc-c2dd96792242","Type":"ContainerStarted","Data":"1a72083cd660a16b28aa48abba8796a3061ffe79e50ee360e12d1ea04d49598d"} Apr 16 18:15:31.257254 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.257218 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts"] Apr 16 18:15:31.260810 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.260786 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" Apr 16 18:15:31.275323 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.275294 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts"] Apr 16 18:15:31.278660 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.278631 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2847bd53-2a20-4981-a0d9-29a0392aef9c-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-ll9ps\" (UID: \"2847bd53-2a20-4981-a0d9-29a0392aef9c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" Apr 16 18:15:31.379617 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.379582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2847bd53-2a20-4981-a0d9-29a0392aef9c-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-ll9ps\" (UID: \"2847bd53-2a20-4981-a0d9-29a0392aef9c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" Apr 16 18:15:31.379808 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.379644 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e7fd811-a17a-4ab8-8c18-e26dda756487-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts\" (UID: \"6e7fd811-a17a-4ab8-8c18-e26dda756487\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" Apr 16 18:15:31.379997 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.379980 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2847bd53-2a20-4981-a0d9-29a0392aef9c-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-ll9ps\" (UID: \"2847bd53-2a20-4981-a0d9-29a0392aef9c\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" Apr 16 18:15:31.467388 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.467339 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" Apr 16 18:15:31.480498 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.480464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e7fd811-a17a-4ab8-8c18-e26dda756487-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts\" (UID: \"6e7fd811-a17a-4ab8-8c18-e26dda756487\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" Apr 16 18:15:31.480943 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.480914 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e7fd811-a17a-4ab8-8c18-e26dda756487-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts\" (UID: \"6e7fd811-a17a-4ab8-8c18-e26dda756487\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" Apr 16 18:15:31.573405 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.573342 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" Apr 16 18:15:31.610354 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.610319 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps"] Apr 16 18:15:31.614701 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:15:31.614669 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2847bd53_2a20_4981_a0d9_29a0392aef9c.slice/crio-799f47971e363011829276f22da686adc4e2d77748b3ce88232a09554a2950ea WatchSource:0}: Error finding container 799f47971e363011829276f22da686adc4e2d77748b3ce88232a09554a2950ea: Status 404 returned error can't find the container with id 799f47971e363011829276f22da686adc4e2d77748b3ce88232a09554a2950ea Apr 16 18:15:31.717665 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:31.717636 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts"] Apr 16 18:15:31.720207 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:15:31.720171 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e7fd811_a17a_4ab8_8c18_e26dda756487.slice/crio-643413b72a6919488dfcb5145813846ecd426176ef6ae15dfb7e92278b7aeebd WatchSource:0}: Error finding container 643413b72a6919488dfcb5145813846ecd426176ef6ae15dfb7e92278b7aeebd: Status 404 returned error can't find the container with id 643413b72a6919488dfcb5145813846ecd426176ef6ae15dfb7e92278b7aeebd Apr 16 18:15:32.196509 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:32.196417 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" event={"ID":"6e7fd811-a17a-4ab8-8c18-e26dda756487","Type":"ContainerStarted","Data":"643413b72a6919488dfcb5145813846ecd426176ef6ae15dfb7e92278b7aeebd"} Apr 16 18:15:32.199329 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:32.199266 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" event={"ID":"2847bd53-2a20-4981-a0d9-29a0392aef9c","Type":"ContainerStarted","Data":"799f47971e363011829276f22da686adc4e2d77748b3ce88232a09554a2950ea"} Apr 16 18:15:45.267058 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:45.267014 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" event={"ID":"2847bd53-2a20-4981-a0d9-29a0392aef9c","Type":"ContainerStarted","Data":"4791a393e9d6f5349651458e349589ff175ede5149ff0b552a60a1e5713b6687"} Apr 16 18:15:45.268353 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:45.268318 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" event={"ID":"3db464a8-101f-4abb-84dc-c2dd96792242","Type":"ContainerStarted","Data":"be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8"} Apr 16 18:15:45.268514 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:45.268498 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" Apr 16 18:15:45.269810 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:45.269779 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" 
event={"ID":"6e7fd811-a17a-4ab8-8c18-e26dda756487","Type":"ContainerStarted","Data":"cda4f8aca254227e57d6e6ef1cdacbb9d998e2ba2032919a870818ba75dff942"} Apr 16 18:15:45.269975 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:45.269920 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" podUID="3db464a8-101f-4abb-84dc-c2dd96792242" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 18:15:45.298940 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:45.298884 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" podStartSLOduration=1.799326964 podStartE2EDuration="15.298867068s" podCreationTimestamp="2026-04-16 18:15:30 +0000 UTC" firstStartedPulling="2026-04-16 18:15:31.149060227 +0000 UTC m=+805.261867658" lastFinishedPulling="2026-04-16 18:15:44.648600317 +0000 UTC m=+818.761407762" observedRunningTime="2026-04-16 18:15:45.297796507 +0000 UTC m=+819.410603961" watchObservedRunningTime="2026-04-16 18:15:45.298867068 +0000 UTC m=+819.411674513" Apr 16 18:15:46.273345 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:46.273298 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" podUID="3db464a8-101f-4abb-84dc-c2dd96792242" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 18:15:49.287558 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:49.287522 2577 generic.go:358] "Generic (PLEG): container finished" podID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerID="cda4f8aca254227e57d6e6ef1cdacbb9d998e2ba2032919a870818ba75dff942" exitCode=0 Apr 16 18:15:49.287973 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:49.287592 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" event={"ID":"6e7fd811-a17a-4ab8-8c18-e26dda756487","Type":"ContainerDied","Data":"cda4f8aca254227e57d6e6ef1cdacbb9d998e2ba2032919a870818ba75dff942"} Apr 16 18:15:49.288721 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:49.288705 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:15:49.289027 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:49.289003 2577 generic.go:358] "Generic (PLEG): container finished" podID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerID="4791a393e9d6f5349651458e349589ff175ede5149ff0b552a60a1e5713b6687" exitCode=0 Apr 16 18:15:49.289127 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:49.289089 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" event={"ID":"2847bd53-2a20-4981-a0d9-29a0392aef9c","Type":"ContainerDied","Data":"4791a393e9d6f5349651458e349589ff175ede5149ff0b552a60a1e5713b6687"} Apr 16 18:15:56.273875 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:56.273822 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" podUID="3db464a8-101f-4abb-84dc-c2dd96792242" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 18:15:57.325905 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:57.325862 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" event={"ID":"6e7fd811-a17a-4ab8-8c18-e26dda756487","Type":"ContainerStarted","Data":"41afa2499d2195fa5b5364c1fba756cefb5b85c009eea4f64ce380745664f686"} Apr 16 18:15:57.326655 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:57.326633 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" Apr 16 18:15:57.328251 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:57.328186 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:15:57.342662 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:57.342603 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" podStartSLOduration=1.78676164 podStartE2EDuration="26.342587147s" podCreationTimestamp="2026-04-16 18:15:31 +0000 UTC" firstStartedPulling="2026-04-16 18:15:31.723947453 +0000 UTC m=+805.836754890" lastFinishedPulling="2026-04-16 18:15:56.279772948 +0000 UTC m=+830.392580397" observedRunningTime="2026-04-16 18:15:57.341442668 +0000 UTC m=+831.454250137" watchObservedRunningTime="2026-04-16 18:15:57.342587147 +0000 UTC m=+831.455394601" Apr 16 18:15:58.330212 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:15:58.330172 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:16:06.273604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:06.273560 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" podUID="3db464a8-101f-4abb-84dc-c2dd96792242" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 18:16:08.330390 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:08.330322 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:16:08.368492 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:08.368453 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" event={"ID":"2847bd53-2a20-4981-a0d9-29a0392aef9c","Type":"ContainerStarted","Data":"5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664"} Apr 16 18:16:08.368872 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:08.368847 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" Apr 16 18:16:08.370194 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:08.370167 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.38:8080: connect: connection refused" Apr 16 18:16:09.372557 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:09.372518 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:16:16.274121 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:16.274074 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" podUID="3db464a8-101f-4abb-84dc-c2dd96792242" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 18:16:18.330390 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:18.330330 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:16:19.373186 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:19.373138 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:16:26.273738 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:26.273699 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" podUID="3db464a8-101f-4abb-84dc-c2dd96792242" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 18:16:28.330284 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:28.330238 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:16:29.373454 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:29.373404 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:16:36.273820 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:36.273784 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" Apr 16 18:16:36.290247 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:36.290190 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" podStartSLOduration=29.212675436 podStartE2EDuration="1m5.290173689s" podCreationTimestamp="2026-04-16 18:15:31 +0000 UTC" firstStartedPulling="2026-04-16 18:15:31.616631391 +0000 UTC m=+805.729438826" lastFinishedPulling="2026-04-16 18:16:07.694129647 +0000 UTC m=+841.806937079" observedRunningTime="2026-04-16 18:16:08.38876568 +0000 UTC m=+842.501573135" watchObservedRunningTime="2026-04-16 18:16:36.290173689 +0000 
UTC m=+870.402981143" Apr 16 18:16:38.330120 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:38.330076 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:16:39.372752 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:39.372705 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:16:48.330853 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:48.330805 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:16:49.373216 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:49.373164 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:16:58.330814 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:58.330775 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:16:59.373308 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:16:59.373265 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:17:00.681708 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:00.681678 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4"] Apr 16 18:17:00.682112 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:00.681921 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" podUID="3db464a8-101f-4abb-84dc-c2dd96792242" containerName="kserve-container" containerID="cri-o://be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8" gracePeriod=30 Apr 16 18:17:00.757248 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:00.757217 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl"] Apr 16 18:17:00.761024 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:00.761003 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" Apr 16 18:17:00.766065 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:00.766046 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl"] Apr 16 18:17:00.772798 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:00.772781 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" Apr 16 18:17:01.106780 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:01.106753 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl"] Apr 16 18:17:01.109585 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:17:01.109553 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fab5246_d585_4dd5_90e6_2902819b30a3.slice/crio-666556ab935c635f5cdb306b2edf7fe43183b3434b559cb92855b2e5fa33cfe9 WatchSource:0}: Error finding container 666556ab935c635f5cdb306b2edf7fe43183b3434b559cb92855b2e5fa33cfe9: Status 404 returned error can't find the container with id 666556ab935c635f5cdb306b2edf7fe43183b3434b559cb92855b2e5fa33cfe9 Apr 16 18:17:01.559249 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:01.559166 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" event={"ID":"7fab5246-d585-4dd5-90e6-2902819b30a3","Type":"ContainerStarted","Data":"73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d"} Apr 16 18:17:01.559249 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:01.559212 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" event={"ID":"7fab5246-d585-4dd5-90e6-2902819b30a3","Type":"ContainerStarted","Data":"666556ab935c635f5cdb306b2edf7fe43183b3434b559cb92855b2e5fa33cfe9"} Apr 16 18:17:01.559454 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:01.559322 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" Apr 16 18:17:01.560695 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:01.560673 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" podUID="7fab5246-d585-4dd5-90e6-2902819b30a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:17:01.574420 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:01.574359 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" podStartSLOduration=1.574346545 podStartE2EDuration="1.574346545s" podCreationTimestamp="2026-04-16 18:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:01.573110922 +0000 UTC m=+895.685918375" watchObservedRunningTime="2026-04-16 18:17:01.574346545 +0000 UTC m=+895.687154059" Apr 16 18:17:02.563005 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:02.562970 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" podUID="7fab5246-d585-4dd5-90e6-2902819b30a3" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:17:03.473294 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:03.473254 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:17:03.735896 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:03.735834 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" Apr 16 18:17:04.569825 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:04.569789 2577 generic.go:358] "Generic (PLEG): container finished" podID="3db464a8-101f-4abb-84dc-c2dd96792242" containerID="be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8" exitCode=0 Apr 16 18:17:04.569977 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:04.569872 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" Apr 16 18:17:04.569977 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:04.569874 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" event={"ID":"3db464a8-101f-4abb-84dc-c2dd96792242","Type":"ContainerDied","Data":"be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8"} Apr 16 18:17:04.569977 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:04.569912 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4" event={"ID":"3db464a8-101f-4abb-84dc-c2dd96792242","Type":"ContainerDied","Data":"1a72083cd660a16b28aa48abba8796a3061ffe79e50ee360e12d1ea04d49598d"} Apr 16 18:17:04.569977 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:04.569930 2577 scope.go:117] "RemoveContainer" containerID="be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8" Apr 16 18:17:04.578116 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:04.578101 2577 scope.go:117] "RemoveContainer" containerID="be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8" Apr 16 18:17:04.578391 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:17:04.578359 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8\": container with ID starting with be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8 not found: ID does not exist" containerID="be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8" Apr 16 18:17:04.578489 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:04.578399 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8"} err="failed to get container status \"be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8\": rpc error: code = NotFound desc = could not find container \"be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8\": container with ID starting with be0c476ce2b8d2ee76e3ac5e24bfad17727f53c81c9501f7f68b79d26cfbd9d8 not found: ID does not exist" Apr 16 18:17:04.586094 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:04.586072 2577 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4"] Apr 16 18:17:04.589478 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:04.589458 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390f9-predictor-666f56949b-vxbx4"] Apr 16 18:17:06.409760 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:06.409735 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:17:06.413729 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:06.413710 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:17:06.417337 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:06.417315 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:17:06.421606 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:06.421589 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:17:06.479678 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:06.479657 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db464a8-101f-4abb-84dc-c2dd96792242" path="/var/lib/kubelet/pods/3db464a8-101f-4abb-84dc-c2dd96792242/volumes" Apr 16 18:17:09.374390 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:09.374344 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" Apr 16 18:17:12.563111 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:12.563069 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" podUID="7fab5246-d585-4dd5-90e6-2902819b30a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:17:13.474941 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:13.474910 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" Apr 16 18:17:22.563514 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:22.563468 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" podUID="7fab5246-d585-4dd5-90e6-2902819b30a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:17:30.703934 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:30.703899 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw"] Apr 16 18:17:30.704462 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:30.704445 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3db464a8-101f-4abb-84dc-c2dd96792242" containerName="kserve-container" Apr 16 18:17:30.704529 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:30.704464 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db464a8-101f-4abb-84dc-c2dd96792242" containerName="kserve-container" Apr 16 18:17:30.704576 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:30.704539 
2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="3db464a8-101f-4abb-84dc-c2dd96792242" containerName="kserve-container" Apr 16 18:17:30.707590 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:30.707572 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" Apr 16 18:17:30.718981 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:30.718957 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" Apr 16 18:17:30.725919 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:30.725878 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps"] Apr 16 18:17:30.726334 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:30.726283 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="kserve-container" containerID="cri-o://5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664" gracePeriod=30 Apr 16 18:17:30.728113 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:30.728087 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw"] Apr 16 18:17:30.835950 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:30.835907 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts"] Apr 16 18:17:30.836515 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:30.836279 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" containerID="cri-o://41afa2499d2195fa5b5364c1fba756cefb5b85c009eea4f64ce380745664f686" gracePeriod=30 Apr 16 18:17:30.889237 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:30.889201 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw"] Apr 16 18:17:30.893428 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:17:30.893394 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b84db2_4bf3_418a_8219_eeca35b69e0a.slice/crio-6a9479b7c1798cb815363167b0c52060ab85b6de2a28872f8aafb1e7be476044 WatchSource:0}: Error finding container 6a9479b7c1798cb815363167b0c52060ab85b6de2a28872f8aafb1e7be476044: Status 404 returned error can't find the container with id 6a9479b7c1798cb815363167b0c52060ab85b6de2a28872f8aafb1e7be476044 Apr 16 18:17:31.667705 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:31.667661 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" event={"ID":"d1b84db2-4bf3-418a-8219-eeca35b69e0a","Type":"ContainerStarted","Data":"c5e64c06665436a8d030e5e55765435245d852ff631924031f61d4c95b99f0a3"} Apr 16 18:17:31.667705 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:31.667711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" event={"ID":"d1b84db2-4bf3-418a-8219-eeca35b69e0a","Type":"ContainerStarted","Data":"6a9479b7c1798cb815363167b0c52060ab85b6de2a28872f8aafb1e7be476044"} Apr 16 
18:17:31.667970 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:31.667860 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" Apr 16 18:17:31.669274 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:31.669246 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" podUID="d1b84db2-4bf3-418a-8219-eeca35b69e0a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 18:17:31.683272 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:31.683215 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" podStartSLOduration=1.683196751 podStartE2EDuration="1.683196751s" podCreationTimestamp="2026-04-16 18:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:31.681575465 +0000 UTC m=+925.794382919" watchObservedRunningTime="2026-04-16 18:17:31.683196751 +0000 UTC m=+925.796004204" Apr 16 18:17:32.563435 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:32.563341 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" podUID="7fab5246-d585-4dd5-90e6-2902819b30a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:17:32.672072 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:32.672029 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" podUID="d1b84db2-4bf3-418a-8219-eeca35b69e0a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 18:17:33.473477 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:33.473437 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:17:35.679026 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.679001 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" Apr 16 18:17:35.683048 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.683022 2577 generic.go:358] "Generic (PLEG): container finished" podID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerID="5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664" exitCode=0 Apr 16 18:17:35.683174 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.683083 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" Apr 16 18:17:35.683174 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.683098 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" event={"ID":"2847bd53-2a20-4981-a0d9-29a0392aef9c","Type":"ContainerDied","Data":"5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664"} Apr 16 18:17:35.683174 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.683132 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps" event={"ID":"2847bd53-2a20-4981-a0d9-29a0392aef9c","Type":"ContainerDied","Data":"799f47971e363011829276f22da686adc4e2d77748b3ce88232a09554a2950ea"} Apr 16 18:17:35.683174 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.683146 2577 scope.go:117] "RemoveContainer" containerID="5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664" Apr 16 18:17:35.691037 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.691018 2577 scope.go:117] "RemoveContainer" containerID="4791a393e9d6f5349651458e349589ff175ede5149ff0b552a60a1e5713b6687" Apr 16 18:17:35.699688 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.699671 2577 scope.go:117] "RemoveContainer" containerID="5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664" Apr 16 18:17:35.699997 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:17:35.699978 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664\": container with ID starting with 5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664 not found: ID does not exist" containerID="5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664" Apr 16 18:17:35.700078 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.700007 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664"} err="failed to get container status \"5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664\": rpc error: code = NotFound desc = could not find container \"5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664\": container with ID starting with 5b761ee01cf95f02d0a3f19390d5ed7549d69bb0f849f44916019f53fff0b664 not found: ID does not exist" Apr 16 18:17:35.700078 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.700027 2577 scope.go:117] "RemoveContainer" containerID="4791a393e9d6f5349651458e349589ff175ede5149ff0b552a60a1e5713b6687" Apr 16 18:17:35.700276 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:17:35.700258 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4791a393e9d6f5349651458e349589ff175ede5149ff0b552a60a1e5713b6687\": container with ID starting with 4791a393e9d6f5349651458e349589ff175ede5149ff0b552a60a1e5713b6687 not found: ID does not exist" containerID="4791a393e9d6f5349651458e349589ff175ede5149ff0b552a60a1e5713b6687" Apr 16 18:17:35.700317 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.700283 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4791a393e9d6f5349651458e349589ff175ede5149ff0b552a60a1e5713b6687"} err="failed to get container status \"4791a393e9d6f5349651458e349589ff175ede5149ff0b552a60a1e5713b6687\": rpc error: code 
= NotFound desc = could not find container \"4791a393e9d6f5349651458e349589ff175ede5149ff0b552a60a1e5713b6687\": container with ID starting with 4791a393e9d6f5349651458e349589ff175ede5149ff0b552a60a1e5713b6687 not found: ID does not exist" Apr 16 18:17:35.809950 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.809853 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2847bd53-2a20-4981-a0d9-29a0392aef9c-kserve-provision-location\") pod \"2847bd53-2a20-4981-a0d9-29a0392aef9c\" (UID: \"2847bd53-2a20-4981-a0d9-29a0392aef9c\") " Apr 16 18:17:35.810192 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.810169 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2847bd53-2a20-4981-a0d9-29a0392aef9c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2847bd53-2a20-4981-a0d9-29a0392aef9c" (UID: "2847bd53-2a20-4981-a0d9-29a0392aef9c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:17:35.911154 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:35.911113 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2847bd53-2a20-4981-a0d9-29a0392aef9c-kserve-provision-location\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:17:36.005012 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:36.004979 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps"] Apr 16 18:17:36.009269 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:36.009245 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-ll9ps"] Apr 16 18:17:36.477562 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:36.477523 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" path="/var/lib/kubelet/pods/2847bd53-2a20-4981-a0d9-29a0392aef9c/volumes" Apr 16 18:17:36.689064 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:36.689033 2577 generic.go:358] "Generic (PLEG): container finished" podID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerID="41afa2499d2195fa5b5364c1fba756cefb5b85c009eea4f64ce380745664f686" exitCode=0 Apr 16 18:17:36.689523 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:36.689104 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" event={"ID":"6e7fd811-a17a-4ab8-8c18-e26dda756487","Type":"ContainerDied","Data":"41afa2499d2195fa5b5364c1fba756cefb5b85c009eea4f64ce380745664f686"} Apr 16 18:17:36.786863 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:36.786840 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" Apr 16 18:17:36.920856 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:36.920816 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e7fd811-a17a-4ab8-8c18-e26dda756487-kserve-provision-location\") pod \"6e7fd811-a17a-4ab8-8c18-e26dda756487\" (UID: \"6e7fd811-a17a-4ab8-8c18-e26dda756487\") " Apr 16 18:17:36.921138 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:36.921117 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e7fd811-a17a-4ab8-8c18-e26dda756487-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6e7fd811-a17a-4ab8-8c18-e26dda756487" (UID: "6e7fd811-a17a-4ab8-8c18-e26dda756487"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:17:37.022534 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:37.022428 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e7fd811-a17a-4ab8-8c18-e26dda756487-kserve-provision-location\") on node \"ip-10-0-134-133.ec2.internal\" DevicePath \"\"" Apr 16 18:17:37.695284 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:37.695249 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" event={"ID":"6e7fd811-a17a-4ab8-8c18-e26dda756487","Type":"ContainerDied","Data":"643413b72a6919488dfcb5145813846ecd426176ef6ae15dfb7e92278b7aeebd"} Apr 16 18:17:37.695284 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:37.695266 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts" Apr 16 18:17:37.695858 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:37.695297 2577 scope.go:117] "RemoveContainer" containerID="41afa2499d2195fa5b5364c1fba756cefb5b85c009eea4f64ce380745664f686" Apr 16 18:17:37.703799 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:37.703779 2577 scope.go:117] "RemoveContainer" containerID="cda4f8aca254227e57d6e6ef1cdacbb9d998e2ba2032919a870818ba75dff942" Apr 16 18:17:37.717667 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:37.717639 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts"] Apr 16 18:17:37.722555 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:37.722532 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-d76b874f9-gp6ts"] Apr 16 18:17:38.477762 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:38.477733 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" path="/var/lib/kubelet/pods/6e7fd811-a17a-4ab8-8c18-e26dda756487/volumes" Apr 16 18:17:42.563019 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:42.562983 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" podUID="7fab5246-d585-4dd5-90e6-2902819b30a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:17:42.672553 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:42.672514 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" podUID="d1b84db2-4bf3-418a-8219-eeca35b69e0a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 18:17:52.564570 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:52.564519 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" Apr 16 18:17:52.672921 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:17:52.672875 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" podUID="d1b84db2-4bf3-418a-8219-eeca35b69e0a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 18:18:02.672614 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:18:02.672575 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" podUID="d1b84db2-4bf3-418a-8219-eeca35b69e0a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 18:18:12.672537 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:18:12.672500 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" podUID="d1b84db2-4bf3-418a-8219-eeca35b69e0a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 18:18:22.674253 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:18:22.674173 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" Apr 16 18:22:06.436209 
ip-10-0-134-133 kubenswrapper[2577]: I0416 18:22:06.436176 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:22:06.442240 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:22:06.442215 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:22:06.443332 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:22:06.443313 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:22:06.449310 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:22:06.449292 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:26:25.782198 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.782161 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl"] Apr 16 18:26:25.782799 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.782474 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" podUID="7fab5246-d585-4dd5-90e6-2902819b30a3" containerName="kserve-container" containerID="cri-o://73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d" gracePeriod=30 Apr 16 18:26:25.811039 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.810997 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2"] Apr 16 18:26:25.811563 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.811545 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="storage-initializer" Apr 16 18:26:25.811623 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.811565 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="storage-initializer" Apr 16 18:26:25.811623 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.811582 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="kserve-container" Apr 16 18:26:25.811623 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.811588 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="kserve-container" Apr 16 18:26:25.811623 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.811596 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="storage-initializer" Apr 16 18:26:25.811623 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.811603 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="storage-initializer" Apr 16 18:26:25.811623 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.811614 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" Apr 16 18:26:25.811623 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.811620 2577 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" Apr 16 18:26:25.811857 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.811679 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2847bd53-2a20-4981-a0d9-29a0392aef9c" containerName="kserve-container" Apr 16 18:26:25.811857 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.811692 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e7fd811-a17a-4ab8-8c18-e26dda756487" containerName="kserve-container" Apr 16 18:26:25.814631 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.814615 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" Apr 16 18:26:25.822924 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.822903 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2"] Apr 16 18:26:25.825265 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.825244 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" Apr 16 18:26:25.962109 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.960049 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2"] Apr 16 18:26:25.966826 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:25.966801 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:26:26.596352 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:26.596315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" event={"ID":"362627f4-bd02-4e6f-aa32-4bc5fc2874f8","Type":"ContainerStarted","Data":"dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f"} Apr 16 18:26:26.596352 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:26.596358 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" event={"ID":"362627f4-bd02-4e6f-aa32-4bc5fc2874f8","Type":"ContainerStarted","Data":"2bcd19fdcc8e6adfbf3a18afd80327c379cf3445b4d988f7d84e3f0a94e8fead"} Apr 16 18:26:26.596603 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:26.596476 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" Apr 16 18:26:26.597872 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:26.597845 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" podUID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 18:26:26.613450 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:26.613405 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" podStartSLOduration=1.613392895 podStartE2EDuration="1.613392895s" podCreationTimestamp="2026-04-16 18:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:26:26.611553605 +0000 UTC m=+1460.724361072" watchObservedRunningTime="2026-04-16 18:26:26.613392895 +0000 UTC m=+1460.726200345" Apr 16 
18:26:27.600089 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:27.600050 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" podUID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 18:26:29.032263 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:29.032237 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" Apr 16 18:26:29.607051 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:29.607009 2577 generic.go:358] "Generic (PLEG): container finished" podID="7fab5246-d585-4dd5-90e6-2902819b30a3" containerID="73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d" exitCode=0 Apr 16 18:26:29.607233 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:29.607073 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" Apr 16 18:26:29.607233 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:29.607088 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" event={"ID":"7fab5246-d585-4dd5-90e6-2902819b30a3","Type":"ContainerDied","Data":"73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d"} Apr 16 18:26:29.607233 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:29.607125 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl" event={"ID":"7fab5246-d585-4dd5-90e6-2902819b30a3","Type":"ContainerDied","Data":"666556ab935c635f5cdb306b2edf7fe43183b3434b559cb92855b2e5fa33cfe9"} Apr 16 18:26:29.607233 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:29.607140 2577 scope.go:117] "RemoveContainer" containerID="73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d" Apr 16 18:26:29.618770 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:29.618753 2577 scope.go:117] "RemoveContainer" containerID="73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d" Apr 16 18:26:29.619017 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:26:29.618999 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d\": container with ID starting with 73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d not found: ID does not exist" containerID="73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d" Apr 16 18:26:29.619081 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:29.619027 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d"} err="failed to get container status \"73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d\": rpc error: code = NotFound desc = could not find container \"73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d\": container with ID starting with 73b678586f71410861115c831de5b36af72355e0134cd75927c5252a6294c74d not found: ID does not exist" Apr 16 18:26:29.630606 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:29.630574 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl"] Apr 16 
18:26:29.635888 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:29.635864 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-04c3d-predictor-7c964cdb94-hzrhl"] Apr 16 18:26:30.477186 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:30.477156 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fab5246-d585-4dd5-90e6-2902819b30a3" path="/var/lib/kubelet/pods/7fab5246-d585-4dd5-90e6-2902819b30a3/volumes" Apr 16 18:26:37.601008 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:37.600968 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" podUID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 18:26:47.600201 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:47.600155 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" podUID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 18:26:55.630928 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:55.630889 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw"] Apr 16 18:26:55.631431 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:55.631203 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" podUID="d1b84db2-4bf3-418a-8219-eeca35b69e0a" containerName="kserve-container" containerID="cri-o://c5e64c06665436a8d030e5e55765435245d852ff631924031f61d4c95b99f0a3" gracePeriod=30 Apr 16 18:26:55.655759 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:55.655727 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b"] Apr 16 18:26:55.656336 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:55.656309 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7fab5246-d585-4dd5-90e6-2902819b30a3" containerName="kserve-container" Apr 16 18:26:55.656336 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:55.656334 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fab5246-d585-4dd5-90e6-2902819b30a3" containerName="kserve-container" Apr 16 18:26:55.656564 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:55.656464 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7fab5246-d585-4dd5-90e6-2902819b30a3" containerName="kserve-container" Apr 16 18:26:55.660851 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:55.660830 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" Apr 16 18:26:55.665516 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:55.665493 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b"] Apr 16 18:26:55.674424 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:55.674403 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" Apr 16 18:26:55.810270 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:55.810231 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b"] Apr 16 18:26:55.816454 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:26:55.816410 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99bcb2ca_e0ce_4d51_a4d9_428ae55d0989.slice/crio-14ca7d0c521a104b8f6350c2415d386f4d063fdac98e2668eff760526d2c9ff7 WatchSource:0}: Error finding container 14ca7d0c521a104b8f6350c2415d386f4d063fdac98e2668eff760526d2c9ff7: Status 404 returned error can't find the container with id 14ca7d0c521a104b8f6350c2415d386f4d063fdac98e2668eff760526d2c9ff7 Apr 16 18:26:56.705455 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:56.705419 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" event={"ID":"99bcb2ca-e0ce-4d51-a4d9-428ae55d0989","Type":"ContainerStarted","Data":"8dfb7af59d3175df451684d653a125b424f01646d59d898f2577147757acd923"} Apr 16 18:26:56.705455 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:56.705459 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" event={"ID":"99bcb2ca-e0ce-4d51-a4d9-428ae55d0989","Type":"ContainerStarted","Data":"14ca7d0c521a104b8f6350c2415d386f4d063fdac98e2668eff760526d2c9ff7"} Apr 16 18:26:56.706048 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:56.705616 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" Apr 16 18:26:56.706890 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:56.706862 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" podUID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 18:26:57.600956 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:57.600912 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" podUID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 18:26:57.710043 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:57.710008 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" podUID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 18:26:58.715988 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:58.715957 2577 generic.go:358] "Generic (PLEG): container finished" podID="d1b84db2-4bf3-418a-8219-eeca35b69e0a" containerID="c5e64c06665436a8d030e5e55765435245d852ff631924031f61d4c95b99f0a3" exitCode=0 Apr 16 18:26:58.716467 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:58.716032 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" 
event={"ID":"d1b84db2-4bf3-418a-8219-eeca35b69e0a","Type":"ContainerDied","Data":"c5e64c06665436a8d030e5e55765435245d852ff631924031f61d4c95b99f0a3"} Apr 16 18:26:58.780029 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:58.779999 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" Apr 16 18:26:58.795345 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:58.795291 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" podStartSLOduration=3.795271178 podStartE2EDuration="3.795271178s" podCreationTimestamp="2026-04-16 18:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:26:56.718995859 +0000 UTC m=+1490.831803313" watchObservedRunningTime="2026-04-16 18:26:58.795271178 +0000 UTC m=+1492.908078635" Apr 16 18:26:59.721880 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:59.721849 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" Apr 16 18:26:59.722339 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:59.721848 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw" event={"ID":"d1b84db2-4bf3-418a-8219-eeca35b69e0a","Type":"ContainerDied","Data":"6a9479b7c1798cb815363167b0c52060ab85b6de2a28872f8aafb1e7be476044"} Apr 16 18:26:59.722339 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:59.721974 2577 scope.go:117] "RemoveContainer" containerID="c5e64c06665436a8d030e5e55765435245d852ff631924031f61d4c95b99f0a3" Apr 16 18:26:59.744698 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:59.744670 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw"] Apr 16 18:26:59.749580 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:26:59.749557 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0403e-predictor-65bc898d66-f8rjw"] Apr 16 18:27:00.477864 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:00.477831 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1b84db2-4bf3-418a-8219-eeca35b69e0a" path="/var/lib/kubelet/pods/d1b84db2-4bf3-418a-8219-eeca35b69e0a/volumes" Apr 16 18:27:06.462191 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:06.462155 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:27:06.468722 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:06.468694 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:27:06.469746 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:06.469720 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:27:06.476657 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:06.476628 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:27:07.600969 
ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:07.600923 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" podUID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 18:27:07.710652 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:07.710616 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" podUID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 18:27:17.601365 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:17.601327 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" Apr 16 18:27:17.711001 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:17.710953 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" podUID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 18:27:27.710476 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:27.710345 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" podUID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 18:27:37.710941 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:37.710902 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" podUID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 18:27:46.101179 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.101142 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2"] Apr 16 18:27:46.101800 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.101436 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" podUID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" containerName="kserve-container" containerID="cri-o://dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f" gracePeriod=30 Apr 16 18:27:46.120232 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.120194 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz"] Apr 16 18:27:46.120647 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.120624 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1b84db2-4bf3-418a-8219-eeca35b69e0a" containerName="kserve-container" Apr 16 18:27:46.120647 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.120640 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b84db2-4bf3-418a-8219-eeca35b69e0a" containerName="kserve-container" Apr 16 18:27:46.121056 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.120733 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1b84db2-4bf3-418a-8219-eeca35b69e0a" 
containerName="kserve-container" Apr 16 18:27:46.123957 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.123941 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" Apr 16 18:27:46.128254 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.128227 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz"] Apr 16 18:27:46.134356 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.134337 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" Apr 16 18:27:46.262503 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.262473 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz"] Apr 16 18:27:46.265321 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:27:46.265286 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7554a6c0_607d_4699_ae6a_3825504b1d58.slice/crio-9bfd68fba8cc1f3c4faa5369be4e74d6f1804a4ef09635e7fd04b0e73e914438 WatchSource:0}: Error finding container 9bfd68fba8cc1f3c4faa5369be4e74d6f1804a4ef09635e7fd04b0e73e914438: Status 404 returned error can't find the container with id 9bfd68fba8cc1f3c4faa5369be4e74d6f1804a4ef09635e7fd04b0e73e914438 Apr 16 18:27:46.890291 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.890250 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" event={"ID":"7554a6c0-607d-4699-ae6a-3825504b1d58","Type":"ContainerStarted","Data":"3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548"} Apr 16 18:27:46.890291 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.890294 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" event={"ID":"7554a6c0-607d-4699-ae6a-3825504b1d58","Type":"ContainerStarted","Data":"9bfd68fba8cc1f3c4faa5369be4e74d6f1804a4ef09635e7fd04b0e73e914438"} Apr 16 18:27:46.890575 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.890398 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" Apr 16 18:27:46.891501 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.891477 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" podUID="7554a6c0-607d-4699-ae6a-3825504b1d58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 18:27:46.918069 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:46.918017 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" podStartSLOduration=0.918001392 podStartE2EDuration="918.001392ms" podCreationTimestamp="2026-04-16 18:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:27:46.917224132 +0000 UTC m=+1541.030031586" watchObservedRunningTime="2026-04-16 18:27:46.918001392 +0000 UTC m=+1541.030808847" Apr 16 18:27:47.601091 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:47.601047 2577 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" podUID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 18:27:47.711571 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:47.711542 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" Apr 16 18:27:47.893886 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:47.893793 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" podUID="7554a6c0-607d-4699-ae6a-3825504b1d58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 18:27:49.444586 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:49.444555 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" Apr 16 18:27:49.900922 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:49.900891 2577 generic.go:358] "Generic (PLEG): container finished" podID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" containerID="dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f" exitCode=0 Apr 16 18:27:49.901102 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:49.900953 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" Apr 16 18:27:49.901102 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:49.900978 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" event={"ID":"362627f4-bd02-4e6f-aa32-4bc5fc2874f8","Type":"ContainerDied","Data":"dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f"} Apr 16 18:27:49.901102 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:49.901016 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2" event={"ID":"362627f4-bd02-4e6f-aa32-4bc5fc2874f8","Type":"ContainerDied","Data":"2bcd19fdcc8e6adfbf3a18afd80327c379cf3445b4d988f7d84e3f0a94e8fead"} Apr 16 18:27:49.901102 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:49.901033 2577 scope.go:117] "RemoveContainer" containerID="dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f" Apr 16 18:27:49.909616 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:49.909599 2577 scope.go:117] "RemoveContainer" containerID="dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f" Apr 16 18:27:49.909853 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:27:49.909837 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f\": container with ID starting with dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f not found: ID does not exist" containerID="dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f" Apr 16 18:27:49.909900 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:49.909860 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f"} err="failed to get container status \"dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f\": 
rpc error: code = NotFound desc = could not find container \"dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f\": container with ID starting with dae78465bdc0f5550a8c157dd7900389d3b021e5acc8af07546f748a2460ef7f not found: ID does not exist" Apr 16 18:27:49.922033 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:49.922004 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2"] Apr 16 18:27:49.924304 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:49.924279 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ccdbb-predictor-c7cfb5d-jr9f2"] Apr 16 18:27:50.478180 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:50.478149 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" path="/var/lib/kubelet/pods/362627f4-bd02-4e6f-aa32-4bc5fc2874f8/volumes" Apr 16 18:27:57.893886 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:27:57.893841 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" podUID="7554a6c0-607d-4699-ae6a-3825504b1d58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 18:28:07.894707 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:07.894660 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" podUID="7554a6c0-607d-4699-ae6a-3825504b1d58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 18:28:15.897630 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:15.897597 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b"] Apr 16 18:28:15.898093 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:15.897922 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" podUID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" containerName="kserve-container" containerID="cri-o://8dfb7af59d3175df451684d653a125b424f01646d59d898f2577147757acd923" gracePeriod=30 Apr 16 18:28:15.941558 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:15.941523 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg"] Apr 16 18:28:15.942107 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:15.942093 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" containerName="kserve-container" Apr 16 18:28:15.942163 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:15.942112 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" containerName="kserve-container" Apr 16 18:28:15.942266 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:15.942252 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="362627f4-bd02-4e6f-aa32-4bc5fc2874f8" containerName="kserve-container" Apr 16 18:28:15.946464 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:15.946446 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" Apr 16 18:28:15.955576 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:15.955546 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg"] Apr 16 18:28:15.957526 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:15.957504 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" Apr 16 18:28:16.102639 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:16.102606 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg"] Apr 16 18:28:16.106270 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:28:16.106239 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e2600d2_6eaa_4f3d_a32a_5df7538e4591.slice/crio-4a1c8e3840fc8de9794e972be413819a0c986cea872812f68e9343dc74cef9de WatchSource:0}: Error finding container 4a1c8e3840fc8de9794e972be413819a0c986cea872812f68e9343dc74cef9de: Status 404 returned error can't find the container with id 4a1c8e3840fc8de9794e972be413819a0c986cea872812f68e9343dc74cef9de Apr 16 18:28:17.000057 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:17.000012 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" event={"ID":"4e2600d2-6eaa-4f3d-a32a-5df7538e4591","Type":"ContainerStarted","Data":"d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc"} Apr 16 18:28:17.000057 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:17.000054 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" event={"ID":"4e2600d2-6eaa-4f3d-a32a-5df7538e4591","Type":"ContainerStarted","Data":"4a1c8e3840fc8de9794e972be413819a0c986cea872812f68e9343dc74cef9de"} Apr 16 18:28:17.000644 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:17.000236 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" Apr 16 18:28:17.001731 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:17.001704 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" podUID="4e2600d2-6eaa-4f3d-a32a-5df7538e4591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 18:28:17.023157 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:17.023106 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" podStartSLOduration=2.023090791 podStartE2EDuration="2.023090791s" podCreationTimestamp="2026-04-16 18:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:28:17.020222812 +0000 UTC m=+1571.133030266" watchObservedRunningTime="2026-04-16 18:28:17.023090791 +0000 UTC m=+1571.135898247" Apr 16 18:28:17.710882 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:17.710845 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" podUID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 18:28:17.894444 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:17.894403 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" podUID="7554a6c0-607d-4699-ae6a-3825504b1d58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 18:28:18.003426 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:18.003315 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" podUID="4e2600d2-6eaa-4f3d-a32a-5df7538e4591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 18:28:19.008019 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:19.007991 2577 generic.go:358] "Generic (PLEG): container finished" podID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" containerID="8dfb7af59d3175df451684d653a125b424f01646d59d898f2577147757acd923" exitCode=0 Apr 16 18:28:19.008380 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:19.008062 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" event={"ID":"99bcb2ca-e0ce-4d51-a4d9-428ae55d0989","Type":"ContainerDied","Data":"8dfb7af59d3175df451684d653a125b424f01646d59d898f2577147757acd923"} Apr 16 18:28:19.049379 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:19.049353 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" Apr 16 18:28:20.012820 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:20.012789 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" Apr 16 18:28:20.013277 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:20.012787 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b" event={"ID":"99bcb2ca-e0ce-4d51-a4d9-428ae55d0989","Type":"ContainerDied","Data":"14ca7d0c521a104b8f6350c2415d386f4d063fdac98e2668eff760526d2c9ff7"} Apr 16 18:28:20.013277 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:20.012903 2577 scope.go:117] "RemoveContainer" containerID="8dfb7af59d3175df451684d653a125b424f01646d59d898f2577147757acd923" Apr 16 18:28:20.032128 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:20.032102 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b"] Apr 16 18:28:20.034231 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:20.034213 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-516af-predictor-5f8c67c956-pwv6b"] Apr 16 18:28:20.477290 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:20.477260 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" path="/var/lib/kubelet/pods/99bcb2ca-e0ce-4d51-a4d9-428ae55d0989/volumes" Apr 16 18:28:27.894588 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:27.894539 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" podUID="7554a6c0-607d-4699-ae6a-3825504b1d58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 16 18:28:28.003482 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:28.003435 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" podUID="4e2600d2-6eaa-4f3d-a32a-5df7538e4591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 18:28:37.894813 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:37.894773 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" Apr 16 18:28:38.003953 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:38.003907 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" podUID="4e2600d2-6eaa-4f3d-a32a-5df7538e4591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 18:28:48.004182 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:48.004138 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" podUID="4e2600d2-6eaa-4f3d-a32a-5df7538e4591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 18:28:58.004424 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:28:58.004336 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" podUID="4e2600d2-6eaa-4f3d-a32a-5df7538e4591" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 16 18:29:08.004588 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:29:08.004542 2577 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" Apr 16 18:32:06.489398 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:32:06.489350 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:32:06.495819 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:32:06.495800 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:32:06.496448 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:32:06.496431 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:32:06.502628 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:32:06.502612 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:37:06.514993 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:06.514957 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:37:06.522258 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:06.522236 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:37:06.522421 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:06.522240 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:37:06.529046 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:06.529027 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:37:10.954277 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:10.954246 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz"] Apr 16 18:37:10.954694 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:10.954507 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" podUID="7554a6c0-607d-4699-ae6a-3825504b1d58" containerName="kserve-container" containerID="cri-o://3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548" gracePeriod=30 Apr 16 18:37:10.991246 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:10.991217 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w"] Apr 16 18:37:10.991633 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:10.991618 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" containerName="kserve-container" Apr 16 18:37:10.991693 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:10.991637 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" containerName="kserve-container" Apr 16 18:37:10.991733 ip-10-0-134-133 kubenswrapper[2577]: 
I0416 18:37:10.991727 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="99bcb2ca-e0ce-4d51-a4d9-428ae55d0989" containerName="kserve-container" Apr 16 18:37:10.994636 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:10.994618 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" Apr 16 18:37:11.002357 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:11.002327 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w"] Apr 16 18:37:11.007056 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:11.007037 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" Apr 16 18:37:11.142953 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:11.142924 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w"] Apr 16 18:37:11.145091 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:37:11.145065 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88f23671_8785_4093_9eeb_d0d4a01e3309.slice/crio-5fb70559213a96837bfd2b3f17a249ed2d9f02029ff6d97a06df30dcbf4d4cc4 WatchSource:0}: Error finding container 5fb70559213a96837bfd2b3f17a249ed2d9f02029ff6d97a06df30dcbf4d4cc4: Status 404 returned error can't find the container with id 5fb70559213a96837bfd2b3f17a249ed2d9f02029ff6d97a06df30dcbf4d4cc4 Apr 16 18:37:11.147250 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:11.147233 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:37:11.874956 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:11.874924 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" event={"ID":"88f23671-8785-4093-9eeb-d0d4a01e3309","Type":"ContainerStarted","Data":"5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140"} Apr 16 18:37:11.874956 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:11.874958 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" event={"ID":"88f23671-8785-4093-9eeb-d0d4a01e3309","Type":"ContainerStarted","Data":"5fb70559213a96837bfd2b3f17a249ed2d9f02029ff6d97a06df30dcbf4d4cc4"} Apr 16 18:37:11.875174 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:11.875143 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" Apr 16 18:37:11.876199 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:11.876175 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" podUID="88f23671-8785-4093-9eeb-d0d4a01e3309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 18:37:11.890745 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:11.890703 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" podStartSLOduration=1.8906895929999998 podStartE2EDuration="1.890689593s" podCreationTimestamp="2026-04-16 18:37:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:37:11.889226367 +0000 UTC m=+2106.002033836" watchObservedRunningTime="2026-04-16 18:37:11.890689593 +0000 UTC m=+2106.003497050" Apr 16 18:37:12.878439 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:12.878400 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" podUID="88f23671-8785-4093-9eeb-d0d4a01e3309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 18:37:14.102816 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:14.102792 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" Apr 16 18:37:14.886507 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:14.886467 2577 generic.go:358] "Generic (PLEG): container finished" podID="7554a6c0-607d-4699-ae6a-3825504b1d58" containerID="3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548" exitCode=0 Apr 16 18:37:14.886696 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:14.886556 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" Apr 16 18:37:14.886696 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:14.886556 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" event={"ID":"7554a6c0-607d-4699-ae6a-3825504b1d58","Type":"ContainerDied","Data":"3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548"} Apr 16 18:37:14.886696 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:14.886597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz" event={"ID":"7554a6c0-607d-4699-ae6a-3825504b1d58","Type":"ContainerDied","Data":"9bfd68fba8cc1f3c4faa5369be4e74d6f1804a4ef09635e7fd04b0e73e914438"} Apr 16 18:37:14.886696 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:14.886612 2577 scope.go:117] "RemoveContainer" containerID="3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548" Apr 16 18:37:14.894870 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:14.894844 2577 scope.go:117] "RemoveContainer" containerID="3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548" Apr 16 18:37:14.895109 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:37:14.895088 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548\": container with ID starting with 3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548 not found: ID does not exist" containerID="3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548" Apr 16 18:37:14.895154 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:14.895117 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548"} err="failed to get container status \"3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548\": rpc error: code = NotFound desc = could not find container \"3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548\": container with ID starting with 3e2eee5cbf413a3633cd4c3a974a81900492d747d52c5ab0605eb027386cc548 not found: ID does not exist" 
Apr 16 18:37:14.901680 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:14.901657 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz"] Apr 16 18:37:14.905494 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:14.905473 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-df772-predictor-5c6cd9cb6b-6bgvz"] Apr 16 18:37:16.478207 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:16.478175 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7554a6c0-607d-4699-ae6a-3825504b1d58" path="/var/lib/kubelet/pods/7554a6c0-607d-4699-ae6a-3825504b1d58/volumes" Apr 16 18:37:22.879128 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:22.879081 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" podUID="88f23671-8785-4093-9eeb-d0d4a01e3309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 18:37:32.879142 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:32.879093 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" podUID="88f23671-8785-4093-9eeb-d0d4a01e3309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 18:37:40.828563 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:40.828511 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg"] Apr 16 18:37:40.829172 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:40.828856 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" podUID="4e2600d2-6eaa-4f3d-a32a-5df7538e4591" containerName="kserve-container" containerID="cri-o://d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc" gracePeriod=30 Apr 16 18:37:40.847589 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:40.847555 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr"] Apr 16 18:37:40.847950 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:40.847938 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7554a6c0-607d-4699-ae6a-3825504b1d58" containerName="kserve-container" Apr 16 18:37:40.847996 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:40.847952 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7554a6c0-607d-4699-ae6a-3825504b1d58" containerName="kserve-container" Apr 16 18:37:40.848031 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:40.848024 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7554a6c0-607d-4699-ae6a-3825504b1d58" containerName="kserve-container" Apr 16 18:37:40.850858 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:40.850839 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" Apr 16 18:37:40.856557 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:40.856519 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr"] Apr 16 18:37:40.863977 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:40.863954 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" Apr 16 18:37:41.007626 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:41.007589 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr"] Apr 16 18:37:41.010429 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:37:41.010355 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod889d9c6e_5de9_486e_9c66_ed99adb14530.slice/crio-3d3818152fe58fac317b6b2c8e07f66d8236de65098558811b4c67137cde4d99 WatchSource:0}: Error finding container 3d3818152fe58fac317b6b2c8e07f66d8236de65098558811b4c67137cde4d99: Status 404 returned error can't find the container with id 3d3818152fe58fac317b6b2c8e07f66d8236de65098558811b4c67137cde4d99 Apr 16 18:37:41.987677 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:41.987636 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" event={"ID":"889d9c6e-5de9-486e-9c66-ed99adb14530","Type":"ContainerStarted","Data":"3575e7ed443d2f408f3818e42539fbf16449c64eec2f5a29e25641752d4d3bb2"} Apr 16 18:37:41.987677 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:41.987679 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" event={"ID":"889d9c6e-5de9-486e-9c66-ed99adb14530","Type":"ContainerStarted","Data":"3d3818152fe58fac317b6b2c8e07f66d8236de65098558811b4c67137cde4d99"} Apr 16 18:37:41.988136 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:41.987833 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" Apr 16 18:37:41.989016 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:41.988991 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" podUID="889d9c6e-5de9-486e-9c66-ed99adb14530" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 18:37:42.003464 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:42.003412 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" podStartSLOduration=2.003396306 podStartE2EDuration="2.003396306s" podCreationTimestamp="2026-04-16 18:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:37:42.000879499 +0000 UTC m=+2136.113686953" watchObservedRunningTime="2026-04-16 18:37:42.003396306 +0000 UTC m=+2136.116203757" Apr 16 18:37:42.878894 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:42.878851 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" podUID="88f23671-8785-4093-9eeb-d0d4a01e3309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 18:37:42.991041 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:42.990999 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" podUID="889d9c6e-5de9-486e-9c66-ed99adb14530" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: 
connection refused" Apr 16 18:37:44.175583 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:44.175557 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" Apr 16 18:37:45.000053 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:45.000011 2577 generic.go:358] "Generic (PLEG): container finished" podID="4e2600d2-6eaa-4f3d-a32a-5df7538e4591" containerID="d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc" exitCode=0 Apr 16 18:37:45.000224 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:45.000072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" event={"ID":"4e2600d2-6eaa-4f3d-a32a-5df7538e4591","Type":"ContainerDied","Data":"d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc"} Apr 16 18:37:45.000224 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:45.000102 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" event={"ID":"4e2600d2-6eaa-4f3d-a32a-5df7538e4591","Type":"ContainerDied","Data":"4a1c8e3840fc8de9794e972be413819a0c986cea872812f68e9343dc74cef9de"} Apr 16 18:37:45.000224 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:45.000107 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg" Apr 16 18:37:45.000224 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:45.000121 2577 scope.go:117] "RemoveContainer" containerID="d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc" Apr 16 18:37:45.008233 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:45.008203 2577 scope.go:117] "RemoveContainer" containerID="d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc" Apr 16 18:37:45.008518 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:37:45.008497 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc\": container with ID starting with d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc not found: ID does not exist" containerID="d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc" Apr 16 18:37:45.008618 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:45.008531 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc"} err="failed to get container status \"d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc\": rpc error: code = NotFound desc = could not find container \"d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc\": container with ID starting with d1163026b52c9ffc95aff3e5829e320c578bd5c08effc3c6441929ef31e478fc not found: ID does not exist" Apr 16 18:37:45.014544 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:45.014522 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg"] Apr 16 18:37:45.020156 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:45.020135 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ed24b-predictor-945b44ddd-lpdjg"] Apr 16 18:37:46.476981 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:46.476949 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4e2600d2-6eaa-4f3d-a32a-5df7538e4591" path="/var/lib/kubelet/pods/4e2600d2-6eaa-4f3d-a32a-5df7538e4591/volumes" Apr 16 18:37:52.879203 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:52.879116 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" podUID="88f23671-8785-4093-9eeb-d0d4a01e3309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 18:37:52.990999 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:37:52.990963 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" podUID="889d9c6e-5de9-486e-9c66-ed99adb14530" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 18:38:02.880202 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:02.880166 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" Apr 16 18:38:02.991274 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:02.991231 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" podUID="889d9c6e-5de9-486e-9c66-ed99adb14530" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 18:38:12.991132 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:12.991084 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" podUID="889d9c6e-5de9-486e-9c66-ed99adb14530" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 18:38:22.991498 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:22.991453 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" podUID="889d9c6e-5de9-486e-9c66-ed99adb14530" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 18:38:31.276797 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:31.276755 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w"] Apr 16 18:38:31.277399 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:31.277002 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" podUID="88f23671-8785-4093-9eeb-d0d4a01e3309" containerName="kserve-container" containerID="cri-o://5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140" gracePeriod=30 Apr 16 18:38:31.294326 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:31.294290 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5"] Apr 16 18:38:31.294933 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:31.294916 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e2600d2-6eaa-4f3d-a32a-5df7538e4591" containerName="kserve-container" Apr 16 18:38:31.295004 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:31.294936 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2600d2-6eaa-4f3d-a32a-5df7538e4591" containerName="kserve-container" Apr 16 18:38:31.295060 ip-10-0-134-133 
kubenswrapper[2577]: I0416 18:38:31.295043 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e2600d2-6eaa-4f3d-a32a-5df7538e4591" containerName="kserve-container" Apr 16 18:38:31.299709 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:31.299686 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" Apr 16 18:38:31.303845 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:31.303813 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5"] Apr 16 18:38:31.311071 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:31.311046 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" Apr 16 18:38:31.454412 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:31.454364 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5"] Apr 16 18:38:31.457151 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:38:31.457119 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52716b75_befe_4500_9562_fb6fdc3d46de.slice/crio-e9152859bb0674b7eef226ebabc8612b27012e339e22c435631b72bb31625c38 WatchSource:0}: Error finding container e9152859bb0674b7eef226ebabc8612b27012e339e22c435631b72bb31625c38: Status 404 returned error can't find the container with id e9152859bb0674b7eef226ebabc8612b27012e339e22c435631b72bb31625c38 Apr 16 18:38:32.177291 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:32.177248 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" event={"ID":"52716b75-befe-4500-9562-fb6fdc3d46de","Type":"ContainerStarted","Data":"22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d"} Apr 16 18:38:32.177291 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:32.177294 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" event={"ID":"52716b75-befe-4500-9562-fb6fdc3d46de","Type":"ContainerStarted","Data":"e9152859bb0674b7eef226ebabc8612b27012e339e22c435631b72bb31625c38"} Apr 16 18:38:32.177614 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:32.177426 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" Apr 16 18:38:32.178702 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:32.178678 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" podUID="52716b75-befe-4500-9562-fb6fdc3d46de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 18:38:32.192674 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:32.192620 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" podStartSLOduration=1.192605137 podStartE2EDuration="1.192605137s" podCreationTimestamp="2026-04-16 18:38:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:38:32.191393804 +0000 UTC m=+2186.304201250" watchObservedRunningTime="2026-04-16 18:38:32.192605137 +0000 
UTC m=+2186.305412587" Apr 16 18:38:32.878752 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:32.878702 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" podUID="88f23671-8785-4093-9eeb-d0d4a01e3309" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 18:38:32.992556 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:32.992520 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" Apr 16 18:38:33.181335 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:33.181247 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" podUID="52716b75-befe-4500-9562-fb6fdc3d46de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 18:38:34.732778 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:34.732755 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" Apr 16 18:38:35.189389 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:35.189330 2577 generic.go:358] "Generic (PLEG): container finished" podID="88f23671-8785-4093-9eeb-d0d4a01e3309" containerID="5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140" exitCode=0 Apr 16 18:38:35.189572 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:35.189414 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" Apr 16 18:38:35.189572 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:35.189413 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" event={"ID":"88f23671-8785-4093-9eeb-d0d4a01e3309","Type":"ContainerDied","Data":"5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140"} Apr 16 18:38:35.189572 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:35.189458 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w" event={"ID":"88f23671-8785-4093-9eeb-d0d4a01e3309","Type":"ContainerDied","Data":"5fb70559213a96837bfd2b3f17a249ed2d9f02029ff6d97a06df30dcbf4d4cc4"} Apr 16 18:38:35.189572 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:35.189475 2577 scope.go:117] "RemoveContainer" containerID="5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140" Apr 16 18:38:35.197916 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:35.197898 2577 scope.go:117] "RemoveContainer" containerID="5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140" Apr 16 18:38:35.198216 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:38:35.198195 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140\": container with ID starting with 5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140 not found: ID does not exist" containerID="5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140" Apr 16 18:38:35.198280 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:35.198225 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140"} err="failed to get container status \"5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140\": rpc error: code = NotFound desc = could not find container \"5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140\": container with ID starting with 5d7891999396e52febdff78ee6770f9a1d3fdcf96507e94cbd78742130263140 not found: ID does not exist" Apr 16 18:38:35.210919 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:35.210883 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w"] Apr 16 18:38:35.215150 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:35.215115 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-78581-predictor-8648c458b4-v2c7w"] Apr 16 18:38:36.477863 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:36.477827 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88f23671-8785-4093-9eeb-d0d4a01e3309" path="/var/lib/kubelet/pods/88f23671-8785-4093-9eeb-d0d4a01e3309/volumes" Apr 16 18:38:43.182127 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:43.182083 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" podUID="52716b75-befe-4500-9562-fb6fdc3d46de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 18:38:53.181632 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:38:53.181588 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" podUID="52716b75-befe-4500-9562-fb6fdc3d46de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 18:39:03.181476 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:39:03.181436 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" podUID="52716b75-befe-4500-9562-fb6fdc3d46de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 18:39:13.181453 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:39:13.181414 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" podUID="52716b75-befe-4500-9562-fb6fdc3d46de" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 18:39:23.183188 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:39:23.183150 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" Apr 16 18:42:06.538865 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:42:06.538837 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:42:06.545933 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:42:06.545913 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:42:06.549398 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:42:06.549380 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:42:06.556465 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:42:06.556450 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:47:06.563600 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:06.563520 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:47:06.570784 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:06.570760 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:47:06.578987 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:06.578968 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:47:06.585726 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:06.585711 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:47:56.154446 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:56.154414 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5"] Apr 16 18:47:56.154972 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:56.154634 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" podUID="52716b75-befe-4500-9562-fb6fdc3d46de" containerName="kserve-container" containerID="cri-o://22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d" gracePeriod=30 Apr 16 18:47:58.888978 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:58.888955 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" Apr 16 18:47:59.151110 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:59.151075 2577 generic.go:358] "Generic (PLEG): container finished" podID="52716b75-befe-4500-9562-fb6fdc3d46de" containerID="22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d" exitCode=0 Apr 16 18:47:59.151290 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:59.151140 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" Apr 16 18:47:59.151290 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:59.151168 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" event={"ID":"52716b75-befe-4500-9562-fb6fdc3d46de","Type":"ContainerDied","Data":"22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d"} Apr 16 18:47:59.151290 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:59.151209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5" event={"ID":"52716b75-befe-4500-9562-fb6fdc3d46de","Type":"ContainerDied","Data":"e9152859bb0674b7eef226ebabc8612b27012e339e22c435631b72bb31625c38"} Apr 16 18:47:59.151290 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:59.151227 2577 scope.go:117] "RemoveContainer" containerID="22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d" Apr 16 18:47:59.159545 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:59.159529 2577 scope.go:117] "RemoveContainer" containerID="22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d" Apr 16 18:47:59.159794 ip-10-0-134-133 kubenswrapper[2577]: E0416 18:47:59.159778 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d\": container with ID starting with 22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d not found: ID does not exist" containerID="22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d" Apr 16 18:47:59.159840 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:59.159804 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d"} err="failed to get container status \"22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d\": rpc error: code = NotFound desc = could not find container \"22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d\": container with ID starting with 22b4652dff51e53c99671d7e758bdb6bca02142366f93c083b4eac8b6193158d not found: ID does not exist" Apr 16 18:47:59.171259 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:59.171238 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5"] Apr 16 18:47:59.174231 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:47:59.174211 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-84e06-predictor-59d6c4b5b8-qwgj5"] Apr 16 18:48:00.477824 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:48:00.477792 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52716b75-befe-4500-9562-fb6fdc3d46de" path="/var/lib/kubelet/pods/52716b75-befe-4500-9562-fb6fdc3d46de/volumes" Apr 16 18:52:06.589976 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:52:06.589953 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:52:06.597601 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:52:06.597580 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:52:06.604591 ip-10-0-134-133 
kubenswrapper[2577]: I0416 18:52:06.604576 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:52:06.611016 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:52:06.610998 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:55:10.459066 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:10.459027 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr"] Apr 16 18:55:10.459578 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:10.459337 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" podUID="889d9c6e-5de9-486e-9c66-ed99adb14530" containerName="kserve-container" containerID="cri-o://3575e7ed443d2f408f3818e42539fbf16449c64eec2f5a29e25641752d4d3bb2" gracePeriod=30 Apr 16 18:55:12.991189 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:12.991141 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" podUID="889d9c6e-5de9-486e-9c66-ed99adb14530" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 18:55:13.643861 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:13.643825 2577 generic.go:358] "Generic (PLEG): container finished" podID="889d9c6e-5de9-486e-9c66-ed99adb14530" containerID="3575e7ed443d2f408f3818e42539fbf16449c64eec2f5a29e25641752d4d3bb2" exitCode=0 Apr 16 18:55:13.644012 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:13.643858 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" event={"ID":"889d9c6e-5de9-486e-9c66-ed99adb14530","Type":"ContainerDied","Data":"3575e7ed443d2f408f3818e42539fbf16449c64eec2f5a29e25641752d4d3bb2"} Apr 16 18:55:13.699494 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:13.699458 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" Apr 16 18:55:14.648417 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:14.648386 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" Apr 16 18:55:14.648417 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:14.648392 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr" event={"ID":"889d9c6e-5de9-486e-9c66-ed99adb14530","Type":"ContainerDied","Data":"3d3818152fe58fac317b6b2c8e07f66d8236de65098558811b4c67137cde4d99"} Apr 16 18:55:14.648904 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:14.648440 2577 scope.go:117] "RemoveContainer" containerID="3575e7ed443d2f408f3818e42539fbf16449c64eec2f5a29e25641752d4d3bb2" Apr 16 18:55:14.664021 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:14.663995 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr"] Apr 16 18:55:14.667909 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:14.667885 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-89f53-predictor-7b489589b4-4c4pr"] Apr 16 18:55:16.477924 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:16.477890 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="889d9c6e-5de9-486e-9c66-ed99adb14530" path="/var/lib/kubelet/pods/889d9c6e-5de9-486e-9c66-ed99adb14530/volumes" Apr 16 18:55:36.085823 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.085781 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vt288/must-gather-smhc9"] Apr 16 18:55:36.086302 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.086154 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52716b75-befe-4500-9562-fb6fdc3d46de" containerName="kserve-container" Apr 16 18:55:36.086302 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.086165 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="52716b75-befe-4500-9562-fb6fdc3d46de" containerName="kserve-container" Apr 16 18:55:36.086302 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.086184 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88f23671-8785-4093-9eeb-d0d4a01e3309" containerName="kserve-container" Apr 16 18:55:36.086302 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.086190 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f23671-8785-4093-9eeb-d0d4a01e3309" containerName="kserve-container" Apr 16 18:55:36.086302 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.086198 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="889d9c6e-5de9-486e-9c66-ed99adb14530" containerName="kserve-container" Apr 16 18:55:36.086302 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.086203 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="889d9c6e-5de9-486e-9c66-ed99adb14530" containerName="kserve-container" Apr 16 18:55:36.086302 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.086270 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="52716b75-befe-4500-9562-fb6fdc3d46de" containerName="kserve-container" Apr 16 18:55:36.086302 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.086282 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="889d9c6e-5de9-486e-9c66-ed99adb14530" containerName="kserve-container" Apr 16 18:55:36.086302 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.086289 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="88f23671-8785-4093-9eeb-d0d4a01e3309" containerName="kserve-container" 
Apr 16 18:55:36.090749 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.090729 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vt288/must-gather-smhc9" Apr 16 18:55:36.092810 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.092776 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vt288\"/\"kube-root-ca.crt\"" Apr 16 18:55:36.092810 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.092790 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vt288\"/\"openshift-service-ca.crt\"" Apr 16 18:55:36.093252 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.093234 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vt288\"/\"default-dockercfg-nbxpr\"" Apr 16 18:55:36.096135 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.096109 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vt288/must-gather-smhc9"] Apr 16 18:55:36.179570 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.179531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp6hw\" (UniqueName: \"kubernetes.io/projected/f92d9c2e-2745-4924-be87-5ee6578115d3-kube-api-access-lp6hw\") pod \"must-gather-smhc9\" (UID: \"f92d9c2e-2745-4924-be87-5ee6578115d3\") " pod="openshift-must-gather-vt288/must-gather-smhc9" Apr 16 18:55:36.179570 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.179568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f92d9c2e-2745-4924-be87-5ee6578115d3-must-gather-output\") pod \"must-gather-smhc9\" (UID: \"f92d9c2e-2745-4924-be87-5ee6578115d3\") " pod="openshift-must-gather-vt288/must-gather-smhc9" Apr 16 18:55:36.280510 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.280454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lp6hw\" (UniqueName: \"kubernetes.io/projected/f92d9c2e-2745-4924-be87-5ee6578115d3-kube-api-access-lp6hw\") pod \"must-gather-smhc9\" (UID: \"f92d9c2e-2745-4924-be87-5ee6578115d3\") " pod="openshift-must-gather-vt288/must-gather-smhc9" Apr 16 18:55:36.280510 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.280518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f92d9c2e-2745-4924-be87-5ee6578115d3-must-gather-output\") pod \"must-gather-smhc9\" (UID: \"f92d9c2e-2745-4924-be87-5ee6578115d3\") " pod="openshift-must-gather-vt288/must-gather-smhc9" Apr 16 18:55:36.280882 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.280860 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f92d9c2e-2745-4924-be87-5ee6578115d3-must-gather-output\") pod \"must-gather-smhc9\" (UID: \"f92d9c2e-2745-4924-be87-5ee6578115d3\") " pod="openshift-must-gather-vt288/must-gather-smhc9" Apr 16 18:55:36.289342 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.289308 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp6hw\" (UniqueName: \"kubernetes.io/projected/f92d9c2e-2745-4924-be87-5ee6578115d3-kube-api-access-lp6hw\") pod \"must-gather-smhc9\" (UID: \"f92d9c2e-2745-4924-be87-5ee6578115d3\") " 
pod="openshift-must-gather-vt288/must-gather-smhc9" Apr 16 18:55:36.408705 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.408601 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vt288/must-gather-smhc9" Apr 16 18:55:36.542349 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.542314 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vt288/must-gather-smhc9"] Apr 16 18:55:36.545121 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:55:36.545090 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf92d9c2e_2745_4924_be87_5ee6578115d3.slice/crio-e82b1fb83de90e2bcca27e319d9a82f3e44ab31c21ebbd6daff9836f28897089 WatchSource:0}: Error finding container e82b1fb83de90e2bcca27e319d9a82f3e44ab31c21ebbd6daff9836f28897089: Status 404 returned error can't find the container with id e82b1fb83de90e2bcca27e319d9a82f3e44ab31c21ebbd6daff9836f28897089 Apr 16 18:55:36.546845 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.546823 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:55:36.728279 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:36.728188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vt288/must-gather-smhc9" event={"ID":"f92d9c2e-2745-4924-be87-5ee6578115d3","Type":"ContainerStarted","Data":"e82b1fb83de90e2bcca27e319d9a82f3e44ab31c21ebbd6daff9836f28897089"} Apr 16 18:55:37.734183 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:37.734150 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vt288/must-gather-smhc9" event={"ID":"f92d9c2e-2745-4924-be87-5ee6578115d3","Type":"ContainerStarted","Data":"4c792fba3454d532fb276521b4c164f88cd91c5982bb048d95ce4926ff987ba0"} Apr 16 18:55:37.734183 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:37.734188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vt288/must-gather-smhc9" event={"ID":"f92d9c2e-2745-4924-be87-5ee6578115d3","Type":"ContainerStarted","Data":"87b7bfb8d7d9a6ed9b77118f66de93520ad4cba8567426eee560cb28f05467b1"} Apr 16 18:55:37.750032 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:37.749980 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vt288/must-gather-smhc9" podStartSLOduration=0.976333366 podStartE2EDuration="1.749964317s" podCreationTimestamp="2026-04-16 18:55:36 +0000 UTC" firstStartedPulling="2026-04-16 18:55:36.547012343 +0000 UTC m=+3210.659819790" lastFinishedPulling="2026-04-16 18:55:37.320643308 +0000 UTC m=+3211.433450741" observedRunningTime="2026-04-16 18:55:37.748895632 +0000 UTC m=+3211.861703085" watchObservedRunningTime="2026-04-16 18:55:37.749964317 +0000 UTC m=+3211.862771770" Apr 16 18:55:38.846905 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:38.846856 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bsftx_842ea51c-5928-423a-9820-b4041ccdbe7b/global-pull-secret-syncer/0.log" Apr 16 18:55:38.961117 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:38.961089 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4spvw_f0debde6-1ccc-484b-a994-63e26bc909b9/konnectivity-agent/0.log" Apr 16 18:55:39.071773 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:39.071742 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-133.ec2.internal_550cd3d62a888e29b98d67015acd7a34/haproxy/0.log" Apr 16 18:55:42.340799 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.340744 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16a93748-4af4-4427-b39e-657b0f9ac96b/alertmanager/0.log" Apr 16 18:55:42.366358 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.366325 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16a93748-4af4-4427-b39e-657b0f9ac96b/config-reloader/0.log" Apr 16 18:55:42.391442 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.391414 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16a93748-4af4-4427-b39e-657b0f9ac96b/kube-rbac-proxy-web/0.log" Apr 16 18:55:42.419295 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.419260 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16a93748-4af4-4427-b39e-657b0f9ac96b/kube-rbac-proxy/0.log" Apr 16 18:55:42.450319 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.450291 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16a93748-4af4-4427-b39e-657b0f9ac96b/kube-rbac-proxy-metric/0.log" Apr 16 18:55:42.479477 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.479324 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16a93748-4af4-4427-b39e-657b0f9ac96b/prom-label-proxy/0.log" Apr 16 18:55:42.501010 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.500970 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16a93748-4af4-4427-b39e-657b0f9ac96b/init-config-reloader/0.log" Apr 16 18:55:42.571561 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.571524 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-cmpk5_15702bb5-aa4d-4152-b4a2-faadc3c7fa5f/cluster-monitoring-operator/0.log" Apr 16 18:55:42.595757 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.595663 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-g5cj2_070020cc-e67b-4965-a3df-3cf85fed6a85/kube-state-metrics/0.log" Apr 16 18:55:42.619118 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.619089 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-g5cj2_070020cc-e67b-4965-a3df-3cf85fed6a85/kube-rbac-proxy-main/0.log" Apr 16 18:55:42.643680 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.643649 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-g5cj2_070020cc-e67b-4965-a3df-3cf85fed6a85/kube-rbac-proxy-self/0.log" Apr 16 18:55:42.699417 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.699360 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-kstnj_dcbf1809-c5f1-459c-a1bc-66069006fd9a/monitoring-plugin/0.log" Apr 16 18:55:42.816095 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.816028 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7l6ll_69c059ad-8285-4260-ab9e-9163abfdcada/node-exporter/0.log" Apr 16 18:55:42.835785 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.835758 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7l6ll_69c059ad-8285-4260-ab9e-9163abfdcada/kube-rbac-proxy/0.log" Apr 16 18:55:42.856417 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.856330 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7l6ll_69c059ad-8285-4260-ab9e-9163abfdcada/init-textfile/0.log" Apr 16 18:55:42.956126 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.956081 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-kl8xd_45727ce7-e2c5-48b0-b000-4e32997b56df/kube-rbac-proxy-main/0.log" Apr 16 18:55:42.984761 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:42.984722 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-kl8xd_45727ce7-e2c5-48b0-b000-4e32997b56df/kube-rbac-proxy-self/0.log" Apr 16 18:55:43.013734 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:43.013704 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-kl8xd_45727ce7-e2c5-48b0-b000-4e32997b56df/openshift-state-metrics/0.log" Apr 16 18:55:43.270089 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:43.269992 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-tcjww_5b420faf-ed52-4f86-9d02-a3f48f948b9e/prometheus-operator-admission-webhook/0.log" Apr 16 18:55:44.712137 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:44.712095 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-bsnjn_69a0864a-c403-4e89-a598-d2a7ec22d2fc/networking-console-plugin/0.log" Apr 16 18:55:45.160113 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:45.160080 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/2.log" Apr 16 18:55:45.165603 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:45.165574 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6g6wf_39964503-ddba-4c2e-9063-e712eb49041b/console-operator/3.log" Apr 16 18:55:45.555879 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:45.555809 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55c796ccb9-hvjzr_f8d65037-b15a-4ca6-ab9c-0d31940528fe/console/0.log" Apr 16 18:55:45.589929 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:45.589886 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-l4wg4_fa774bed-dcba-4e18-8dfe-ae8bef67b1d1/download-server/0.log" Apr 16 18:55:45.995701 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:45.995674 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-5crmb_bbff1a65-2590-46a2-b70c-3bc7271945eb/volume-data-source-validator/0.log" Apr 16 18:55:46.089627 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.089580 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn"] Apr 16 18:55:46.094865 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.094839 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.103754 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.103724 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn"] Apr 16 18:55:46.179095 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.179064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/28035b8e-9da8-438b-b2eb-b8cc84380103-sys\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.179270 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.179104 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78hfm\" (UniqueName: \"kubernetes.io/projected/28035b8e-9da8-438b-b2eb-b8cc84380103-kube-api-access-78hfm\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.179270 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.179221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/28035b8e-9da8-438b-b2eb-b8cc84380103-lib-modules\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.179270 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.179255 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/28035b8e-9da8-438b-b2eb-b8cc84380103-proc\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.179415 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.179281 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/28035b8e-9da8-438b-b2eb-b8cc84380103-podres\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.280303 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.280216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/28035b8e-9da8-438b-b2eb-b8cc84380103-sys\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.280303 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.280262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78hfm\" (UniqueName: \"kubernetes.io/projected/28035b8e-9da8-438b-b2eb-b8cc84380103-kube-api-access-78hfm\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.280560 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.280349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/28035b8e-9da8-438b-b2eb-b8cc84380103-lib-modules\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.280560 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.280357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/28035b8e-9da8-438b-b2eb-b8cc84380103-sys\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.280560 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.280512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/28035b8e-9da8-438b-b2eb-b8cc84380103-proc\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.280560 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.280528 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/28035b8e-9da8-438b-b2eb-b8cc84380103-lib-modules\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.280560 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.280549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/28035b8e-9da8-438b-b2eb-b8cc84380103-podres\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.280855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.280584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/28035b8e-9da8-438b-b2eb-b8cc84380103-proc\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.280855 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.280733 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/28035b8e-9da8-438b-b2eb-b8cc84380103-podres\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.289292 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.289255 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78hfm\" (UniqueName: \"kubernetes.io/projected/28035b8e-9da8-438b-b2eb-b8cc84380103-kube-api-access-78hfm\") pod \"perf-node-gather-daemonset-lmqcn\" (UID: \"28035b8e-9da8-438b-b2eb-b8cc84380103\") " pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.407283 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.407240 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.549832 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.549788 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn"] Apr 16 18:55:46.552959 ip-10-0-134-133 kubenswrapper[2577]: W0416 18:55:46.552924 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod28035b8e_9da8_438b_b2eb_b8cc84380103.slice/crio-e9bb108c29477e36f33748a1bc1508ba86b3eadc45cdc5c2e15941431d3639ff WatchSource:0}: Error finding container e9bb108c29477e36f33748a1bc1508ba86b3eadc45cdc5c2e15941431d3639ff: Status 404 returned error can't find the container with id e9bb108c29477e36f33748a1bc1508ba86b3eadc45cdc5c2e15941431d3639ff Apr 16 18:55:46.719022 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.718989 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-77v8t_6429bf79-1554-458a-8ed2-de631c73ca89/dns/0.log" Apr 16 18:55:46.741908 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.741880 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-77v8t_6429bf79-1554-458a-8ed2-de631c73ca89/kube-rbac-proxy/0.log" Apr 16 18:55:46.779016 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.778983 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" event={"ID":"28035b8e-9da8-438b-b2eb-b8cc84380103","Type":"ContainerStarted","Data":"dfe35c49b95f794fb8d37c4f97d12b771754b552014d3cb890010d0f11556694"} Apr 16 18:55:46.779016 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.779017 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" event={"ID":"28035b8e-9da8-438b-b2eb-b8cc84380103","Type":"ContainerStarted","Data":"e9bb108c29477e36f33748a1bc1508ba86b3eadc45cdc5c2e15941431d3639ff"} Apr 16 18:55:46.779241 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.779109 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:46.796866 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.796805 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" podStartSLOduration=0.796759316 podStartE2EDuration="796.759316ms" podCreationTimestamp="2026-04-16 18:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:55:46.794864647 +0000 UTC m=+3220.907672100" watchObservedRunningTime="2026-04-16 18:55:46.796759316 +0000 UTC m=+3220.909566770" Apr 16 18:55:46.903658 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:46.903628 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qrflg_be8a29c6-c9c8-407b-9a79-1120ab614958/dns-node-resolver/0.log" Apr 16 18:55:47.472614 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:47.472561 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-s2zfb_24b24bc6-a399-4980-9de1-8258c56623b3/node-ca/0.log" Apr 16 18:55:48.591604 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:48.591571 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-8rjlz_176aef22-2713-42bf-81d6-9602a79bf10f/serve-healthcheck-canary/0.log" Apr 16 18:55:49.006506 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:49.006409 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-lbrtk_879e1a6e-7abe-4b70-9fdd-76b30b854006/insights-operator/0.log" Apr 16 18:55:49.007541 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:49.007520 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-lbrtk_879e1a6e-7abe-4b70-9fdd-76b30b854006/insights-operator/1.log" Apr 16 18:55:49.174661 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:49.174633 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sl5k7_4c10821a-0056-41c9-86ef-a22224af6e3c/kube-rbac-proxy/0.log" Apr 16 18:55:49.194422 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:49.194387 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sl5k7_4c10821a-0056-41c9-86ef-a22224af6e3c/exporter/0.log" Apr 16 18:55:49.213936 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:49.213901 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sl5k7_4c10821a-0056-41c9-86ef-a22224af6e3c/extractor/0.log" Apr 16 18:55:51.237643 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:51.237564 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-2xfjn_88b503cc-4043-4ece-9a71-22032bc4097f/manager/0.log" Apr 16 18:55:51.259550 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:51.259522 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-96vhg_250ecb81-0337-4f25-8bb2-bfb98eb93768/server/0.log" Apr 16 18:55:52.794819 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:52.794789 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vt288/perf-node-gather-daemonset-lmqcn" Apr 16 18:55:55.984745 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:55.984718 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-z9bt2_22606e36-9830-4f6a-a7e5-3577a6591eb5/migrator/0.log" Apr 16 18:55:56.008288 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:56.008263 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-z9bt2_22606e36-9830-4f6a-a7e5-3577a6591eb5/graceful-termination/0.log" Apr 16 18:55:56.387461 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:56.387423 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-zrbs6_71948437-c3bf-419e-8170-14db67f520f4/kube-storage-version-migrator-operator/1.log" Apr 16 18:55:56.388758 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:56.388732 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-zrbs6_71948437-c3bf-419e-8170-14db67f520f4/kube-storage-version-migrator-operator/0.log" Apr 16 18:55:57.413394 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:57.413352 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-969sx_42f0f7c4-f605-4e8b-a431-64e78857571a/kube-multus-additional-cni-plugins/0.log" Apr 16 18:55:57.434817 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:57.434742 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-969sx_42f0f7c4-f605-4e8b-a431-64e78857571a/egress-router-binary-copy/0.log" Apr 16 18:55:57.454439 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:57.454407 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-969sx_42f0f7c4-f605-4e8b-a431-64e78857571a/cni-plugins/0.log" Apr 16 18:55:57.475442 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:57.475410 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-969sx_42f0f7c4-f605-4e8b-a431-64e78857571a/bond-cni-plugin/0.log" Apr 16 18:55:57.496752 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:57.496719 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-969sx_42f0f7c4-f605-4e8b-a431-64e78857571a/routeoverride-cni/0.log" Apr 16 18:55:57.519289 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:57.519259 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-969sx_42f0f7c4-f605-4e8b-a431-64e78857571a/whereabouts-cni-bincopy/0.log" Apr 16 18:55:57.542071 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:57.542042 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-969sx_42f0f7c4-f605-4e8b-a431-64e78857571a/whereabouts-cni/0.log" Apr 16 18:55:57.733746 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:57.733655 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cqqlw_fe56bca4-2974-4d8d-a069-7f2e617e5495/kube-multus/0.log" Apr 16 18:55:57.895977 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:57.895943 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-znzwl_f009e89a-5e15-4d47-81de-24ab98cb437b/network-metrics-daemon/0.log" Apr 16 18:55:57.921117 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:57.921086 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-znzwl_f009e89a-5e15-4d47-81de-24ab98cb437b/kube-rbac-proxy/0.log" Apr 16 18:55:58.712603 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:58.712540 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-controller/0.log" Apr 16 18:55:58.730841 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:58.730805 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/0.log" Apr 16 18:55:58.752328 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:58.752287 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovn-acl-logging/1.log" Apr 16 18:55:58.772916 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:58.772884 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/kube-rbac-proxy-node/0.log" Apr 16 18:55:58.793534 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:58.793501 
2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 18:55:58.811463 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:58.811439 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/northd/0.log" Apr 16 18:55:58.838652 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:58.836962 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/nbdb/0.log" Apr 16 18:55:58.858704 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:58.858672 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/sbdb/0.log" Apr 16 18:55:58.977282 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:55:58.977204 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ptwtr_0041c0cb-37ba-4e2d-8ab3-73fe90eb40df/ovnkube-controller/0.log" Apr 16 18:56:00.528967 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:56:00.528941 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-6dpmh_8e7aafa9-3cde-4bad-8aa1-447548a5edaa/check-endpoints/0.log" Apr 16 18:56:00.583435 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:56:00.583410 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-dfdlg_eb29446e-bb65-416b-a40d-d985b58d7505/network-check-target-container/0.log" Apr 16 18:56:01.539409 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:56:01.539338 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-spc8d_d777df23-9c77-4ee3-a1ad-07ef46670681/iptables-alerter/0.log" Apr 16 18:56:02.266834 ip-10-0-134-133 kubenswrapper[2577]: I0416 18:56:02.266791 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-m8mjb_0897086a-3f20-4bf5-8811-04e196266bdf/tuned/0.log"