Apr 23 13:32:12.507509 ip-10-0-141-176 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 13:32:12.926057 ip-10-0-141-176 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:32:12.926057 ip-10-0-141-176 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 13:32:12.926057 ip-10-0-141-176 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:32:12.926057 ip-10-0-141-176 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 13:32:12.926057 ip-10-0-141-176 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:32:12.927987 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.927893 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 13:32:12.934162 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934144 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:32:12.934162 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934161 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934167 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934171 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934174 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934177 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934180 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934184 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934187 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934191 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
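The five "Flag ... has been deprecated" notices above all point at the same remedy: move those values into the KubeletConfiguration file named by --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump further down). A minimal sketch of that migration, assuming the upstream KubeletConfiguration v1beta1 field names and taking the values from this node's own flags; on an OpenShift node this file is rendered by the Machine Config Operator, so treat it as illustrative rather than something to hand-edit:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # replaces --container-runtime-endpoint="/var/run/crio/crio.sock"
    # (the config-file field expects a URL scheme)
    containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
    # replaces --volume-plugin-dir
    volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
    # replaces --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
    systemReserved:
      cpu: "500m"
      ephemeral-storage: "1Gi"
      memory: "1Gi"
    # --minimum-container-ttl-duration has no direct config-file equivalent;
    # the warning suggests eviction thresholds instead (example value, not from this log)
    evictionHard:
      memory.available: "100Mi"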
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934195 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934198 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934201 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934203 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934206 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934209 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934219 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934222 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934235 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934237 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934240 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:12.934245 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934243 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934246 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934249 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934252 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934254 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934257 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934260 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934263 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934266 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934268 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934271 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934273 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934276 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934279 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934281 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934284 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934286 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934289 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934291 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934293 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:32:12.934714 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934296 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934298 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934302 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934305 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934308 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934310 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934313 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934316 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934325 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934328 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934330 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934333 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934336 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934338 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934341 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934344 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934347 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934349 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934352 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934354 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:32:12.935270 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934357 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934360 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934368 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934372 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934375 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934378 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934381 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934383 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934386 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934388 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934391 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934394 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934396 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934398 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934401 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934403 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934406 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934408 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934411 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:12.935746 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934413 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934416 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934424 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934427 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934430 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934432 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934842 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934848 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934851 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934853 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934856 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934859 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934861 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934864 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934866 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934869 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934871 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934874 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934877 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934879 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:12.936215 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934883 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934886 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934889 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934892 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934895 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934897 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934900 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934903 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934907 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934910 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934912 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934915 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934917 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934920 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934923 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934926 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934928 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934931 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934934 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:32:12.936798 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934938 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934940 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934943 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934945 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934948 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934950 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934953 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934955 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934958 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934960 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934963 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934965 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934968 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934970 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934973 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934976 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934979 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934982 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934984 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:32:12.937283 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934987 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934991 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934995 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.934998 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935000 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935003 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935006 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935008 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935011 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935014 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935017 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935019 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935021 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935024 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935026 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935029 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935031 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935034 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935036 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:32:12.937757 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935039 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935041 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935044 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935046 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935049 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935051 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935053 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935056 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935059 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935061 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935064 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935067 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935070 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935072 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.935075 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936302 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936312 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936319 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936324 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936329 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936333 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 13:32:12.938234 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936337 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936342 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936346 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936349 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936353 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936356 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936360 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936363 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936366 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936369 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936372 2567 flags.go:64] FLAG: --cloud-config=""
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936374 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936377 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936382 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936385 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936388 2567 flags.go:64] FLAG: --config-dir=""
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936391 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936394 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936398 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936402 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936405 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936408 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936411 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936414 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 13:32:12.938751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936417 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936421 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936439 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936445 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936448 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936452 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936455 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936459 2567 flags.go:64] FLAG: --enable-server="true"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936462 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936467 2567 flags.go:64] FLAG: --event-burst="100"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936470 2567 flags.go:64] FLAG: --event-qps="50"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936474 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936477 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936480 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936484 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936487 2567 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936491 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936494 2567 flags.go:64] FLAG: --eviction-soft=""
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936497 2567 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936500 2567 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936503 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936506 2567 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936510 2567 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936513 2567 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936516 2567 flags.go:64] FLAG: --feature-gates=""
Apr 23 13:32:12.939346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936520 2567 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936523 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936526 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936529 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936532 2567 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936535 2567 flags.go:64] FLAG: --help="false"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936538 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-141-176.ec2.internal"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936542 2567 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936545 2567 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936548 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936552 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936555 2567 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936559 2567 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936561 2567 flags.go:64] FLAG: --image-service-endpoint=""
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936564 2567 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936568 2567 flags.go:64] FLAG: --kube-api-burst="100"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936571 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936574 2567 flags.go:64] FLAG: --kube-api-qps="50"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936577 2567 flags.go:64] FLAG: --kube-reserved=""
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936580 2567 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936583 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936586 2567 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936589 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936592 2567 flags.go:64] FLAG: --lock-file=""
Apr 23 13:32:12.939950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936594 2567 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936597 2567 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936601 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936606 2567 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936609 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936614 2567 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936617 2567 flags.go:64] FLAG: --logging-format="text"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936620 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936623 2567 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936626 2567 flags.go:64] FLAG: --manifest-url=""
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936629 2567 flags.go:64] FLAG: --manifest-url-header=""
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936633 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936637 2567 flags.go:64] FLAG: --max-open-files="1000000"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936641 2567 flags.go:64] FLAG: --max-pods="110"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936644 2567 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936647 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936650 2567 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936654 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936657 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936660 2567 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936662 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936670 2567 flags.go:64] FLAG: --node-status-max-images="50"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936674 2567 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936677 2567 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 23 13:32:12.940572 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936680 2567 flags.go:64] FLAG: --pod-cidr=""
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936682 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936688 2567 flags.go:64] FLAG: --pod-manifest-path=""
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936690 2567 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936694 2567 flags.go:64] FLAG: --pods-per-core="0"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936697 2567 flags.go:64] FLAG: --port="10250"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936700 2567 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936703 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-038ce8717d3c98cad"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936706 2567 flags.go:64] FLAG: --qos-reserved=""
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936709 2567 flags.go:64] FLAG: --read-only-port="10255"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936712 2567 flags.go:64] FLAG: --register-node="true"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936715 2567 flags.go:64] FLAG: --register-schedulable="true"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936718 2567 flags.go:64] FLAG: --register-with-taints=""
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936724 2567 flags.go:64] FLAG: --registry-burst="10"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936727 2567 flags.go:64] FLAG: --registry-qps="5"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936730 2567 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936733 2567 flags.go:64] FLAG: --reserved-memory=""
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936737 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936740 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936743 2567 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936746 2567 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936749 2567 flags.go:64] FLAG: --runonce="false"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936752 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936755 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936758 2567 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 13:32:12.941145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936761 2567 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936764 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936767 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936770 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936773 2567 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936776 2567 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936779 2567 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936782 2567 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936785 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936788 2567 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936790 2567 flags.go:64] FLAG: --system-cgroups=""
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936794 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936800 2567 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936803 2567 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936806 2567 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936810 2567 flags.go:64] FLAG: --tls-min-version=""
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936814 2567 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936816 2567 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936819 2567 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936822 2567 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936827 2567 flags.go:64] FLAG: --v="2"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936831 2567 flags.go:64] FLAG: --version="false"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936836 2567 flags.go:64] FLAG: --vmodule=""
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936840 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.936843 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 23 13:32:12.941751 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936941 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936945 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936948 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936953 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936957 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936960 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936963 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936966 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936968 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936971 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936974 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936977 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936979 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936982 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936984 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936987 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936990 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936992 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936995 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:12.942352 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.936997 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937000 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937003 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937005 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937008 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937010 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937013 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937015 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937020 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937022 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937025 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937027 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937030 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937032 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937035 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937037 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937041 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937045 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937049 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:32:12.942794 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937052 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937054 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937058 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937060 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937063 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937066 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937068 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937071 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937073 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937076 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
kubenswrapper[2567]: W0423 13:32:12.937078 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937081 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937083 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937086 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937088 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937091 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937094 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937096 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937099 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937101 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:32:12.943321 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937104 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937108 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937111 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937113 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937116 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937118 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937121 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937124 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937126 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937129 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937131 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937134 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937137 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 
13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937141 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937143 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937146 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937148 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937151 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937153 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937156 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:32:12.943852 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937158 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:32:12.944481 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937161 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:12.944481 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937163 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:32:12.944481 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937166 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:32:12.944481 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937169 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:32:12.944481 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937171 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:32:12.944481 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937173 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:32:12.944481 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.937176 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:32:12.944481 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.937974 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:32:12.947137 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.947104 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 13:32:12.947137 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.947134 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 13:32:12.947287 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.947187 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:12.947287 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:12.947194 2567 feature_gate.go:328] unrecognized feature gate: 
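The feature_gate.go:384 record above is the kubelet's own summary of what it actually resolved after all overrides were applied; the per-gate warnings are noise by comparison. The map is printed in Go's map[key:value ...] notation rather than JSON, so it needs a little massaging before it can be diffed across nodes. A minimal parsing sketch in Python, assuming only the record format shown above (the function name and sample are illustrative, not part of any tool):

    import re

    def parse_feature_gates(record: str) -> dict:
        """Parse a kubelet 'feature gates: {map[...]}' record into a dict of bools."""
        m = re.search(r"feature gates: \{map\[(.*)\]\}", record)
        if m is None:
            raise ValueError("not a feature_gate.go:384 summary record")
        # Pairs are space-separated Name:true / Name:false tokens.
        return {name: value == "true"
                for name, _, value in (pair.partition(":") for pair in m.group(1).split())}

    record = ('I0423 13:32:12.937974 2567 feature_gate.go:384] feature gates: '
              '{map[ImageVolume:true KMSv1:true NodeSwap:false]}')
    print(parse_feature_gates(record))  # {'ImageVolume': True, 'KMSv1': True, 'NodeSwap': False}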
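The kubelet parses its feature-gate configuration more than once during startup, so the same enumeration of unrecognized gates recurs with fresh timestamps; the gate names appear to be OpenShift cluster-level gates that the upstream kubelet simply does not know, so the warnings are expected noise rather than a fault. When triaging a journal like this one, collapsing the flood to one line per gate makes the real signal visible. A small sketch, assuming only the feature_gate.go:328 message format visible above (the script name and journalctl unit are illustrative assumptions):

    import re
    import sys
    from collections import Counter

    # Count how often each unrecognized gate is reported in the journal on stdin.
    counts = Counter(re.findall(r"unrecognized feature gate: (\S+)", sys.stdin.read()))
    for gate, n in counts.most_common():
        print(f"{n:4d}  {gate}")

Invoked, for example, as: journalctl -u kubelet.service | python3 gate_counts.py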
Apr 23 13:32:12.951725 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.948620 2567 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 13:32:12.952579 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.952563 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 13:32:12.953431 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.953418 2567 server.go:1019] "Starting client certificate rotation"
Apr 23 13:32:12.953540 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.953520 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:32:12.953591 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.953573 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:32:12.977368 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.977347 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:32:12.982689 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.982672 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:32:12.995841 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:12.995817 2567 log.go:25] "Validated CRI v1 runtime API"
Apr 23 13:32:13.001482 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.001463 2567 log.go:25] "Validated CRI v1 image API"
Apr 23 13:32:13.002788 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.002771 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 13:32:13.007338 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.007308 2567 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7cb3466e-b6ea-48de-ab84-c8786b7ffc18:/dev/nvme0n1p3 7cf1e795-dc53-4cfa-82b2-c80eafe95180:/dev/nvme0n1p4]
Apr 23 13:32:13.007443 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.007336 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 13:32:13.009094 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.009074 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:32:13.013762 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.013644 2567 manager.go:217] Machine: {Timestamp:2026-04-23 13:32:13.011701376 +0000 UTC m=+0.394230498 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100282 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23ebfbe13f7f4c7562596a9b0dd644 SystemUUID:ec23ebfb-e13f-7f4c-7562-596a9b0dd644 BootID:1f613a12-a8b8-41e5-bad1-64cd38c219f0 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:26:31:2f:4b:3d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:26:31:2f:4b:3d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:86:bc:56:a6:b2:2e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 13:32:13.013762 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.013750 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 13:32:13.013920 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.013852 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 13:32:13.016307 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.016281 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 13:32:13.016457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.016309 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-176.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 13:32:13.016500 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.016467 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 13:32:13.016500 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.016477 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 13:32:13.016500 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.016490 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:32:13.017246 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.017220 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:32:13.018256 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.018245 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:32:13.018379 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.018369 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 13:32:13.020841 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.020830 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 13:32:13.020885 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.020846 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 13:32:13.020885 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.020859 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 13:32:13.020885 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.020869 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 23 13:32:13.020885 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.020879 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 13:32:13.021937 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.021925 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 13:32:13.021976 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.021944 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 13:32:13.024883 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.024861 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 13:32:13.024971 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.024904 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zrjcf"
Apr 23 13:32:13.026980 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.026962 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 13:32:13.028351 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.028336 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 13:32:13.028351 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.028354 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 13:32:13.028457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.028360 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 13:32:13.028457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.028366 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 13:32:13.028457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.028372 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 13:32:13.028457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.028378 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 13:32:13.028457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.028384 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 13:32:13.028457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.028389 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 13:32:13.028457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.028397 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 13:32:13.028457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.028404 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 13:32:13.028457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.028413 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 13:32:13.028457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.028422 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 13:32:13.030441 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.030429 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 13:32:13.030489 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.030445 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 13:32:13.030677 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.030661 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zrjcf"
Apr 23 13:32:13.033257 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.033215 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 13:32:13.033334 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.033217 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-176.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 13:32:13.035166 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.035152 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 13:32:13.035210 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.035194 2567 server.go:1295] "Started kubelet"
Apr 23 13:32:13.035312 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.035288 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 13:32:13.035397 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.035359 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 13:32:13.035457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.035427 2567 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 13:32:13.036047 ip-10-0-141-176 systemd[1]: Started Kubernetes Kubelet.
Apr 23 13:32:13.036424 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.036408 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 13:32:13.037590 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.037576 2567 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 13:32:13.042300 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.042284 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 13:32:13.042620 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.042601 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 13:32:13.043313 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.043293 2567 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 13:32:13.043313 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.043316 2567 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 13:32:13.043454 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.043422 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 13:32:13.043513 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.043488 2567 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 13:32:13.043513 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.043496 2567 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 13:32:13.043729 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.043693 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found"
Apr 23 13:32:13.044585 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.044569 2567 factory.go:153] Registering CRI-O factory
Apr 23 13:32:13.044685 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.044596 2567 factory.go:223] Registration of the crio container factory successfully
Apr 23 13:32:13.044685 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.044663 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 13:32:13.044685 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.044674 2567 factory.go:55] Registering systemd factory
Apr 23 13:32:13.044685 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.044683 2567 factory.go:223] Registration of the systemd container factory successfully
Apr 23 13:32:13.044871 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.044704 2567 factory.go:103] Registering Raw factory
Apr 23 13:32:13.044871 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.044717 2567 manager.go:1196] Started watching for new ooms in manager
Apr 23 13:32:13.045088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.045069 2567 manager.go:319] Starting recovery of all containers
Apr 23 13:32:13.045206 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.045190 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:13.056619 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.056375 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 13:32:13.056918 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.056863 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-176.ec2.internal" not found
Apr 23 13:32:13.057505 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.057480 2567 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-176.ec2.internal\" not found" node="ip-10-0-141-176.ec2.internal"
Apr 23 13:32:13.061653 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.061631 2567 manager.go:324] Recovery completed
Apr 23 13:32:13.062939 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.062919 2567 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 23 13:32:13.066193 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.066180 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:13.068573 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.068558 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:13.068661 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.068591 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:13.068661 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.068607 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:13.069083 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.069070 2567 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 13:32:13.069143 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.069085 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 13:32:13.069143 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.069103 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:32:13.071491 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.071478 2567 policy_none.go:49] "None policy: Start"
Apr 23 13:32:13.071536 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.071496 2567 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 13:32:13.071536 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.071507 2567 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 13:32:13.073188 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.073171 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-176.ec2.internal" not found
Apr 23 13:32:13.112883 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.112863 2567 manager.go:341] "Starting Device Plugin manager"
Apr 23 13:32:13.130520 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.112903 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 13:32:13.130520 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.112919 2567 server.go:85] "Starting device plugin registration server"
Apr 23 13:32:13.130520 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.113174 2567 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 13:32:13.130520 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.113186 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 13:32:13.130520 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.113340 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 13:32:13.130520 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.113433 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 13:32:13.130520 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.113442 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 13:32:13.130520 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.113884 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 13:32:13.130520 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.113927 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-176.ec2.internal\" not found"
Apr 23 13:32:13.134448 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.134429 2567 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-176.ec2.internal" not found
Apr 23 13:32:13.205122 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.205040 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 13:32:13.205122 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.205077 2567 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 13:32:13.205122 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.205099 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 13:32:13.205122 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.205106 2567 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 13:32:13.205425 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.205137 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 13:32:13.207586 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.207554 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:13.213444 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.213424 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:13.214300 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.214286 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:13.214384 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.214316 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:13.214384 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.214327 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:13.214384 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.214351 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-176.ec2.internal"
Apr 23 13:32:13.220470 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.220451 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-176.ec2.internal"
Apr 23 13:32:13.220571 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.220475 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-176.ec2.internal\": node \"ip-10-0-141-176.ec2.internal\" not found"
Apr 23 13:32:13.235400 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.235380 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found"
Apr 23 13:32:13.306059 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.305975 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-176.ec2.internal"]
Apr 23 13:32:13.306193 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.306106 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:13.307080 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.307068 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:13.307145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.307096 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:13.307145 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.307113 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:13.308841 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.308827 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:13.309007 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.308993 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.309044 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.309022 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:32:13.309580 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.309561 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:32:13.309677 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.309588 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:32:13.309677 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.309606 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:32:13.309677 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.309609 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:32:13.309677 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.309626 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:32:13.309677 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.309638 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:32:13.311160 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.311145 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.311205 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.311180 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:32:13.311852 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.311834 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:32:13.311979 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.311862 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:32:13.311979 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.311876 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:32:13.335464 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.335439 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found" Apr 23 13:32:13.339007 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.338991 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-176.ec2.internal\" not found" node="ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.343603 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.343588 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-176.ec2.internal\" not found" node="ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.344360 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.344343 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c43c6c4a8456b6c709608d0d7c51f49-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal\" (UID: \"0c43c6c4a8456b6c709608d0d7c51f49\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.344438 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.344375 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/829c0bd398638defff6c06e99f548781-config\") pod \"kube-apiserver-proxy-ip-10-0-141-176.ec2.internal\" (UID: \"829c0bd398638defff6c06e99f548781\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.344438 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.344412 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0c43c6c4a8456b6c709608d0d7c51f49-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal\" (UID: \"0c43c6c4a8456b6c709608d0d7c51f49\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.435675 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.435638 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found" Apr 23 13:32:13.444947 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.444921 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0c43c6c4a8456b6c709608d0d7c51f49-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal\" (UID: \"0c43c6c4a8456b6c709608d0d7c51f49\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.445049 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.444956 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c43c6c4a8456b6c709608d0d7c51f49-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal\" (UID: \"0c43c6c4a8456b6c709608d0d7c51f49\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.445049 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.444981 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/829c0bd398638defff6c06e99f548781-config\") pod \"kube-apiserver-proxy-ip-10-0-141-176.ec2.internal\" (UID: \"829c0bd398638defff6c06e99f548781\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.445049 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.445009 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0c43c6c4a8456b6c709608d0d7c51f49-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal\" (UID: \"0c43c6c4a8456b6c709608d0d7c51f49\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.445049 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.445042 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0c43c6c4a8456b6c709608d0d7c51f49-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal\" (UID: \"0c43c6c4a8456b6c709608d0d7c51f49\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.445187 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.445012 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/829c0bd398638defff6c06e99f548781-config\") pod \"kube-apiserver-proxy-ip-10-0-141-176.ec2.internal\" (UID: \"829c0bd398638defff6c06e99f548781\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.536451 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.536365 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found" Apr 23 13:32:13.636806 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.636776 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found" Apr 23 13:32:13.640921 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.640901 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.645930 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.645912 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-176.ec2.internal" Apr 23 13:32:13.737691 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.737651 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found" Apr 23 13:32:13.838146 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.838068 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found" Apr 23 13:32:13.938588 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:13.938539 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found" Apr 23 13:32:13.948269 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.948242 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:32:13.953332 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.953317 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 13:32:13.953449 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.953434 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 13:32:13.953508 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.953450 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 13:32:13.953508 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:13.953456 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 13:32:14.033001 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.032954 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 13:27:13 +0000 UTC" deadline="2027-10-13 02:06:42.066201293 +0000 UTC" Apr 23 13:32:14.033001 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.032990 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12900h34m28.033214093s" Apr 23 13:32:14.039121 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:14.039095 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found" Apr 23 13:32:14.043265 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.043245 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 13:32:14.066718 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.066682 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 13:32:14.130089 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.130053 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-pcd4j" Apr 23 13:32:14.141026 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:14.139661 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found" Apr 23 13:32:14.141188 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.141042 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-pcd4j" Apr 23 13:32:14.167491 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:14.167457 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c43c6c4a8456b6c709608d0d7c51f49.slice/crio-241c35ae54495bcf83609a0a1e4659fd900254fb6d6dafd6afd116abc6d024f7 WatchSource:0}: Error finding container 241c35ae54495bcf83609a0a1e4659fd900254fb6d6dafd6afd116abc6d024f7: Status 404 returned error can't find the container with id 241c35ae54495bcf83609a0a1e4659fd900254fb6d6dafd6afd116abc6d024f7 Apr 23 13:32:14.168241 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:14.168213 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod829c0bd398638defff6c06e99f548781.slice/crio-6ed515bec1a95dade509686d9e223c81fb0101211455815ef33763ff3880b316 WatchSource:0}: Error finding container 6ed515bec1a95dade509686d9e223c81fb0101211455815ef33763ff3880b316: Status 404 returned error can't find the container with id 6ed515bec1a95dade509686d9e223c81fb0101211455815ef33763ff3880b316 Apr 23 13:32:14.173104 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.173085 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:32:14.207906 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.207850 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-176.ec2.internal" 
event={"ID":"829c0bd398638defff6c06e99f548781","Type":"ContainerStarted","Data":"6ed515bec1a95dade509686d9e223c81fb0101211455815ef33763ff3880b316"} Apr 23 13:32:14.208755 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.208726 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" event={"ID":"0c43c6c4a8456b6c709608d0d7c51f49","Type":"ContainerStarted","Data":"241c35ae54495bcf83609a0a1e4659fd900254fb6d6dafd6afd116abc6d024f7"} Apr 23 13:32:14.239931 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:14.239889 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found" Apr 23 13:32:14.340417 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:14.340368 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-176.ec2.internal\" not found" Apr 23 13:32:14.386140 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.386086 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:32:14.444477 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.444453 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-176.ec2.internal" Apr 23 13:32:14.452152 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.452132 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:32:14.453348 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.453336 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" Apr 23 13:32:14.464150 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.464131 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:32:14.787305 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.787213 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:32:14.823434 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:14.823402 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:32:15.022063 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.022025 2567 apiserver.go:52] "Watching apiserver" Apr 23 13:32:15.031272 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.031243 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 13:32:15.031768 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.031732 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-node-tuning-operator/tuned-x9nzd","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal","openshift-multus/multus-additional-cni-plugins-w2589","openshift-network-operator/iptables-alerter-hkkq2","openshift-ovn-kubernetes/ovnkube-node-kqqbw","kube-system/global-pull-secret-syncer-p54pj","kube-system/kube-apiserver-proxy-ip-10-0-141-176.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt","openshift-dns/node-resolver-4nhld","openshift-image-registry/node-ca-5lgwg","openshift-multus/multus-26s76","openshift-multus/network-metrics-daemon-c246c","openshift-network-diagnostics/network-check-target-75h8j","kube-system/konnectivity-agent-58dnr"] Apr 23 13:32:15.035712 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.035689 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5lgwg" Apr 23 13:32:15.037199 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.037178 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:15.037344 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.037262 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d" Apr 23 13:32:15.038579 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.038558 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.038688 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.038602 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hkkq2" Apr 23 13:32:15.039115 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.039095 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 13:32:15.041500 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.041482 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-58dnr" Apr 23 13:32:15.041606 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.041514 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.042745 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.042728 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 13:32:15.042831 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.042728 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 13:32:15.042831 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.042735 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 13:32:15.043049 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.043033 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-4nhld" Apr 23 13:32:15.043514 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.043453 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 13:32:15.043588 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.043572 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-svnt7\"" Apr 23 13:32:15.043637 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.043624 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 13:32:15.044267 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.043718 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:32:15.044267 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.044264 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 13:32:15.045129 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.044457 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 13:32:15.045129 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.044634 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-2dsfl\"" Apr 23 13:32:15.045129 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.044660 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ft7qb\"" Apr 23 13:32:15.045129 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.044837 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 13:32:15.045129 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.044893 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 13:32:15.046090 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.045982 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7vkbp\"" Apr 23 13:32:15.048255 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.047135 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 13:32:15.048255 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.047267 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:15.048255 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.047366 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b" Apr 23 13:32:15.048441 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.048311 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 13:32:15.048441 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.048369 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 13:32:15.048536 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.048477 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 13:32:15.050218 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.049108 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 13:32:15.050218 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.049863 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 13:32:15.051951 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.050880 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.052967 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.052946 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpcwq\" (UniqueName: \"kubernetes.io/projected/cc296f8c-211e-4e05-8959-a5aba129cc83-kube-api-access-tpcwq\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.053303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053062 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-run-systemd\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.053303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053101 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-log-socket\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.053303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053141 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6811f33-1a89-4437-950c-bdb29fcbc2f5-hosts-file\") pod \"node-resolver-4nhld\" (UID: \"f6811f33-1a89-4437-950c-bdb29fcbc2f5\") " pod="openshift-dns/node-resolver-4nhld" Apr 23 13:32:15.053303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053157 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.053303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053171 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a642a633-45bc-405e-899d-b28d88699e93-host\") pod \"node-ca-5lgwg\" (UID: \"a642a633-45bc-405e-899d-b28d88699e93\") " pod="openshift-image-registry/node-ca-5lgwg" Apr 23 13:32:15.053303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053209 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc296f8c-211e-4e05-8959-a5aba129cc83-system-cni-dir\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.053303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053260 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-run-netns\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.053303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053292 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-node-log\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.053729 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053368 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-run-ovn-kubernetes\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.053729 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053405 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-cni-netd\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.053729 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053441 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-cni-bin\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.053729 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053557 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-26s76" Apr 23 13:32:15.054838 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.053479 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.055008 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.054847 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wk8bw\"" Apr 23 13:32:15.055101 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055061 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-bqtrf\"" Apr 23 13:32:15.055172 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.054867 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 13:32:15.055172 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055153 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 13:32:15.055316 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.054994 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 13:32:15.055316 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055260 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 13:32:15.055486 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055467 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-env-overrides\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.055626 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055610 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-ovnkube-script-lib\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.055761 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055735 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/226f8750-5ae4-4644-ac04-451c03fc015b-agent-certs\") pod \"konnectivity-agent-58dnr\" (UID: \"226f8750-5ae4-4644-ac04-451c03fc015b\") " pod="kube-system/konnectivity-agent-58dnr" Apr 23 13:32:15.055868 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055769 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc296f8c-211e-4e05-8959-a5aba129cc83-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 
13:32:15.055868 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055796 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-kubelet\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.055868 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055819 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/226f8750-5ae4-4644-ac04-451c03fc015b-konnectivity-ca\") pod \"konnectivity-agent-58dnr\" (UID: \"226f8750-5ae4-4644-ac04-451c03fc015b\") " pod="kube-system/konnectivity-agent-58dnr" Apr 23 13:32:15.055868 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055842 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-registration-dir\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.055868 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055865 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-device-dir\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.056076 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055889 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc296f8c-211e-4e05-8959-a5aba129cc83-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.056076 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055954 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-slash\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.056076 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.055980 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b7kx\" (UniqueName: \"kubernetes.io/projected/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-kube-api-access-8b7kx\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.056076 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056008 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.056076 ip-10-0-141-176 kubenswrapper[2567]: I0423 
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056076 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d945d3c6-4041-4013-9864-0b2c325ebccb-host-slash\") pod \"iptables-alerter-hkkq2\" (UID: \"d945d3c6-4041-4013-9864-0b2c325ebccb\") " pod="openshift-network-operator/iptables-alerter-hkkq2"
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056097 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-systemd-units\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw"
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056113 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-etc-openvswitch\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw"
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056137 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6811f33-1a89-4437-950c-bdb29fcbc2f5-tmp-dir\") pod \"node-resolver-4nhld\" (UID: \"f6811f33-1a89-4437-950c-bdb29fcbc2f5\") " pod="openshift-dns/node-resolver-4nhld"
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056160 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a642a633-45bc-405e-899d-b28d88699e93-serviceca\") pod \"node-ca-5lgwg\" (UID: \"a642a633-45bc-405e-899d-b28d88699e93\") " pod="openshift-image-registry/node-ca-5lgwg"
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056178 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsxv7\" (UniqueName: \"kubernetes.io/projected/a642a633-45bc-405e-899d-b28d88699e93-kube-api-access-vsxv7\") pod \"node-ca-5lgwg\" (UID: \"a642a633-45bc-405e-899d-b28d88699e93\") " pod="openshift-image-registry/node-ca-5lgwg"
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056197 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cc296f8c-211e-4e05-8959-a5aba129cc83-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589"
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056220 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d945d3c6-4041-4013-9864-0b2c325ebccb-iptables-alerter-script\") pod \"iptables-alerter-hkkq2\" (UID: \"d945d3c6-4041-4013-9864-0b2c325ebccb\") " pod="openshift-network-operator/iptables-alerter-hkkq2"
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056274 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-run-openvswitch\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw"
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056304 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw4cx\" (UniqueName: \"kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx\") pod \"network-check-target-75h8j\" (UID: \"a43bd7f9-1505-4e58-acda-ef8e398e302d\") " pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056332 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pqj2\" (UniqueName: \"kubernetes.io/projected/f6811f33-1a89-4437-950c-bdb29fcbc2f5-kube-api-access-4pqj2\") pod \"node-resolver-4nhld\" (UID: \"f6811f33-1a89-4437-950c-bdb29fcbc2f5\") " pod="openshift-dns/node-resolver-4nhld"
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056357 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ks5g\" (UniqueName: \"kubernetes.io/projected/b31fd928-68b6-44d9-91ab-8a340c5c1073-kube-api-access-7ks5g\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt"
Apr 23 13:32:15.056413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056391 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8ntr\" (UniqueName: \"kubernetes.io/projected/d945d3c6-4041-4013-9864-0b2c325ebccb-kube-api-access-j8ntr\") pod \"iptables-alerter-hkkq2\" (UID: \"d945d3c6-4041-4013-9864-0b2c325ebccb\") " pod="openshift-network-operator/iptables-alerter-hkkq2"
Apr 23 13:32:15.056998 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056422 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-var-lib-openvswitch\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw"
Apr 23 13:32:15.056998 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056445 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-run-ovn\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw"
Apr 23 13:32:15.056998 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056480 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-ovnkube-config\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw"
Apr 23 13:32:15.056998 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056521 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-ovn-node-metrics-cert\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw"
Apr 23 13:32:15.056998 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056554 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-socket-dir\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt"
Apr 23 13:32:15.056998 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056583 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-etc-selinux\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt"
Apr 23 13:32:15.056998 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056610 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-sys-fs\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt"
Apr 23 13:32:15.056998 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056655 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc296f8c-211e-4e05-8959-a5aba129cc83-cnibin\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589"
Apr 23 13:32:15.056998 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.056690 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc296f8c-211e-4e05-8959-a5aba129cc83-os-release\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589"
Apr 23 13:32:15.057522 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.057205 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 13:32:15.057607 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.057585 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj"
Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:15.057767 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.057650 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da" Apr 23 13:32:15.057968 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.057945 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 13:32:15.058338 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.058314 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 13:32:15.058427 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.058380 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-b6qz7\"" Apr 23 13:32:15.058490 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.058314 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5r6gd\"" Apr 23 13:32:15.058592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.058320 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:32:15.059153 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.059136 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 13:32:15.059404 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.059344 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jnpgw\"" Apr 23 13:32:15.059636 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.059619 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 13:32:15.142146 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.142115 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:27:14 +0000 UTC" deadline="2027-09-20 09:15:13.76915084 +0000 UTC" Apr 23 13:32:15.142146 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.142141 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12355h42m58.627013021s" Apr 23 13:32:15.144949 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.144922 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 13:32:15.156915 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.156885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pqj2\" (UniqueName: \"kubernetes.io/projected/f6811f33-1a89-4437-950c-bdb29fcbc2f5-kube-api-access-4pqj2\") pod \"node-resolver-4nhld\" (UID: \"f6811f33-1a89-4437-950c-bdb29fcbc2f5\") " pod="openshift-dns/node-resolver-4nhld" Apr 23 13:32:15.157062 ip-10-0-141-176 kubenswrapper[2567]: I0423 
13:32:15.156925 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ks5g\" (UniqueName: \"kubernetes.io/projected/b31fd928-68b6-44d9-91ab-8a340c5c1073-kube-api-access-7ks5g\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.157062 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.156945 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8ntr\" (UniqueName: \"kubernetes.io/projected/d945d3c6-4041-4013-9864-0b2c325ebccb-kube-api-access-j8ntr\") pod \"iptables-alerter-hkkq2\" (UID: \"d945d3c6-4041-4013-9864-0b2c325ebccb\") " pod="openshift-network-operator/iptables-alerter-hkkq2" Apr 23 13:32:15.157062 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.156968 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-var-lib-openvswitch\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.157062 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.156984 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-ovnkube-config\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.157062 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157011 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-multus-socket-dir-parent\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.157062 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157038 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:15.157062 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157065 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/226f8750-5ae4-4644-ac04-451c03fc015b-konnectivity-ca\") pod \"konnectivity-agent-58dnr\" (UID: \"226f8750-5ae4-4644-ac04-451c03fc015b\") " pod="kube-system/konnectivity-agent-58dnr" Apr 23 13:32:15.157442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157113 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-var-lib-openvswitch\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.157442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157114 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-etc-selinux\") pod 
\"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.157442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157169 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc296f8c-211e-4e05-8959-a5aba129cc83-cnibin\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.157442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157207 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc296f8c-211e-4e05-8959-a5aba129cc83-os-release\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.157442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157254 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-run-ovn-kubernetes\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.157442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157259 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-etc-selinux\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.157442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157305 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-system-cni-dir\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.157442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157434 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc296f8c-211e-4e05-8959-a5aba129cc83-cnibin\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.157814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157445 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-run-ovn-kubernetes\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.157814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157438 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/92424b25-21d9-42cb-aca4-cad86a5e3dad-cni-binary-copy\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.157814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157499 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc296f8c-211e-4e05-8959-a5aba129cc83-os-release\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.157814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157499 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-var-lib-kubelet\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.157814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157546 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6811f33-1a89-4437-950c-bdb29fcbc2f5-hosts-file\") pod \"node-resolver-4nhld\" (UID: \"f6811f33-1a89-4437-950c-bdb29fcbc2f5\") " pod="openshift-dns/node-resolver-4nhld" Apr 23 13:32:15.157814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157633 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-run-netns\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.157814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157660 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6811f33-1a89-4437-950c-bdb29fcbc2f5-hosts-file\") pod \"node-resolver-4nhld\" (UID: \"f6811f33-1a89-4437-950c-bdb29fcbc2f5\") " pod="openshift-dns/node-resolver-4nhld" Apr 23 13:32:15.157814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157684 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-cni-netd\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.157814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157721 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-run-netns\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.157814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157725 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-ovnkube-config\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.157814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157800 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-sys\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.157814 ip-10-0-141-176 
kubenswrapper[2567]: I0423 13:32:15.157816 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-cni-netd\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157827 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/226f8750-5ae4-4644-ac04-451c03fc015b-konnectivity-ca\") pod \"konnectivity-agent-58dnr\" (UID: \"226f8750-5ae4-4644-ac04-451c03fc015b\") " pod="kube-system/konnectivity-agent-58dnr" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157838 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-lib-modules\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157862 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-cni-bin\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157884 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157901 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-host\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157925 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-hostroot\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157936 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-cni-bin\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157952 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret\") pod \"global-pull-secret-syncer-p54pj\" (UID: 
\"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157990 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.157994 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/33d32343-1781-48b5-bdcd-a04d2dec36da-dbus\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158054 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-device-dir\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158072 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-slash\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158088 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b7kx\" (UniqueName: \"kubernetes.io/projected/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-kube-api-access-8b7kx\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158112 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-kubernetes\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158135 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-sysctl-d\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158150 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-slash\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.158335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158166 2567 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-device-dir\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158203 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmx6b\" (UniqueName: \"kubernetes.io/projected/92424b25-21d9-42cb-aca4-cad86a5e3dad-kube-api-access-jmx6b\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158252 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-run-openvswitch\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158282 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4cx\" (UniqueName: \"kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx\") pod \"network-check-target-75h8j\" (UID: \"a43bd7f9-1505-4e58-acda-ef8e398e302d\") " pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158300 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-os-release\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158316 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-etc-kubernetes\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158361 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzmfj\" (UniqueName: \"kubernetes.io/projected/69a4531d-2959-43b5-929f-9d7ddf10163b-kube-api-access-bzmfj\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158370 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-run-openvswitch\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158392 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-run-ovn\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158425 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-ovn-node-metrics-cert\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158442 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-run\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158456 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05de8049-2bce-4d70-bdf3-a72ee2c57e37-tmp\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158453 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-run-ovn\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158473 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-multus-cni-dir\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158517 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-var-lib-cni-bin\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158568 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-socket-dir\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158602 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-sys-fs\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.159099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158633 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpcwq\" (UniqueName: 
\"kubernetes.io/projected/cc296f8c-211e-4e05-8959-a5aba129cc83-kube-api-access-tpcwq\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158660 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-run-systemd\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158688 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-log-socket\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158702 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-run-systemd\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158691 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-sys-fs\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158720 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-run-netns\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158758 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-log-socket\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158817 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-socket-dir\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158860 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a642a633-45bc-405e-899d-b28d88699e93-host\") pod \"node-ca-5lgwg\" (UID: \"a642a633-45bc-405e-899d-b28d88699e93\") " pod="openshift-image-registry/node-ca-5lgwg" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158908 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/a642a633-45bc-405e-899d-b28d88699e93-host\") pod \"node-ca-5lgwg\" (UID: \"a642a633-45bc-405e-899d-b28d88699e93\") " pod="openshift-image-registry/node-ca-5lgwg" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158955 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc296f8c-211e-4e05-8959-a5aba129cc83-system-cni-dir\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158979 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-node-log\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.158981 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.159006 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-ovnkube-script-lib\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.159033 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-sysconfig\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.159082 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-cnibin\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.159106 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-var-lib-cni-multus\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.159146 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-env-overrides\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.159814 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.159170 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/226f8750-5ae4-4644-ac04-451c03fc015b-agent-certs\") pod \"konnectivity-agent-58dnr\" (UID: \"226f8750-5ae4-4644-ac04-451c03fc015b\") " pod="kube-system/konnectivity-agent-58dnr" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.159568 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc296f8c-211e-4e05-8959-a5aba129cc83-system-cni-dir\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.159643 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-node-log\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.159876 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-modprobe-d\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.159955 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-run-multus-certs\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.159992 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc296f8c-211e-4e05-8959-a5aba129cc83-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160028 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-kubelet\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160062 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-sysctl-conf\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160096 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-var-lib-kubelet\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.160592 
ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160159 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-tuned\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160208 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc296f8c-211e-4e05-8959-a5aba129cc83-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160294 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-host-kubelet\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160353 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-registration-dir\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160395 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc296f8c-211e-4e05-8959-a5aba129cc83-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160439 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfhjp\" (UniqueName: \"kubernetes.io/projected/05de8049-2bce-4d70-bdf3-a72ee2c57e37-kube-api-access-kfhjp\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160501 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-registration-dir\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160559 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-ovnkube-script-lib\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.160592 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160571 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/92424b25-21d9-42cb-aca4-cad86a5e3dad-multus-daemon-config\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160609 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/33d32343-1781-48b5-bdcd-a04d2dec36da-kubelet-config\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160680 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc296f8c-211e-4e05-8959-a5aba129cc83-cni-binary-copy\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160717 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d945d3c6-4041-4013-9864-0b2c325ebccb-host-slash\") pod \"iptables-alerter-hkkq2\" (UID: \"d945d3c6-4041-4013-9864-0b2c325ebccb\") " pod="openshift-network-operator/iptables-alerter-hkkq2" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160721 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b31fd928-68b6-44d9-91ab-8a340c5c1073-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160754 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-systemd-units\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160835 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-systemd-units\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160910 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d945d3c6-4041-4013-9864-0b2c325ebccb-host-slash\") pod \"iptables-alerter-hkkq2\" (UID: \"d945d3c6-4041-4013-9864-0b2c325ebccb\") " pod="openshift-network-operator/iptables-alerter-hkkq2" Apr 23 13:32:15.161346 ip-10-0-141-176 
kubenswrapper[2567]: I0423 13:32:15.160960 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-etc-openvswitch\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.160995 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc296f8c-211e-4e05-8959-a5aba129cc83-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161041 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6811f33-1a89-4437-950c-bdb29fcbc2f5-tmp-dir\") pod \"node-resolver-4nhld\" (UID: \"f6811f33-1a89-4437-950c-bdb29fcbc2f5\") " pod="openshift-dns/node-resolver-4nhld" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161079 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a642a633-45bc-405e-899d-b28d88699e93-serviceca\") pod \"node-ca-5lgwg\" (UID: \"a642a633-45bc-405e-899d-b28d88699e93\") " pod="openshift-image-registry/node-ca-5lgwg" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161080 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-etc-openvswitch\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161239 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-env-overrides\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161258 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc296f8c-211e-4e05-8959-a5aba129cc83-cni-binary-copy\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.161346 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161300 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsxv7\" (UniqueName: \"kubernetes.io/projected/a642a633-45bc-405e-899d-b28d88699e93-kube-api-access-vsxv7\") pod \"node-ca-5lgwg\" (UID: \"a642a633-45bc-405e-899d-b28d88699e93\") " pod="openshift-image-registry/node-ca-5lgwg" Apr 23 13:32:15.162031 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161373 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6811f33-1a89-4437-950c-bdb29fcbc2f5-tmp-dir\") pod \"node-resolver-4nhld\" (UID: \"f6811f33-1a89-4437-950c-bdb29fcbc2f5\") " pod="openshift-dns/node-resolver-4nhld" 
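
[Editor's note: the records above show the kubelet volume manager's reconcile loop at work. For each (pod, volume) pair it first logs "operationExecutor.VerifyControllerAttachedVolume started" or "operationExecutor.MountVolume started" (reconciler_common.go), then "MountVolume.SetUp succeeded" (operation_generator.go) once the mount completes. A quick way to audit a window like this is to pair each start event with its success event and list whatever never finished. The Python sketch below does that; it is a minimal illustration, not an official tool, and it assumes input lines keep the kubenswrapper format shown here, where klog quotes the message so inner quotes arrive escaped as \".]

    #!/usr/bin/env python3
    """Pair kubelet 'MountVolume started' events with 'MountVolume.SetUp
    succeeded' events and report (pod, volume) pairs that never completed.
    Diagnostic sketch only; feed it `journalctl -u kubelet` output on stdin.
    """
    import re
    import sys

    # The volume name appears as \"name\" after 'for volume', and the pod as
    # a trailing structured field pod="namespace/name" (both visible in the
    # log excerpt above). The optional backslash tolerates klog's \" escaping.
    VOL = re.compile(r'for volume \\?"(?P<vol>[^"\\]+)\\?".*?pod="(?P<pod>[^"]+)"')

    started, succeeded = set(), set()
    for line in sys.stdin:
        m = VOL.search(line)
        if not m:
            continue
        key = (m.group("pod"), m.group("vol"))
        if "operationExecutor.MountVolume started" in line:
            started.add(key)
        elif "MountVolume.SetUp succeeded" in line:
            succeeded.add(key)

    # Anything started but never marked succeeded is either still in flight
    # or stuck in the retry backoff loop.
    for pod, vol in sorted(started - succeeded):
        print(f"no 'SetUp succeeded' yet: pod={pod} volume={vol}")

[Run against this excerpt, the sketch would flag kube-api-access-zw4cx for openshift-network-diagnostics/network-check-target-75h8j, consistent with the E0423 projected-volume errors recorded just below: the kube-root-ca.crt and openshift-service-ca.crt configMaps were not yet registered, so the kubelet refused the mount and scheduled a retry after 500ms (durationBeforeRetry).]
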
Apr 23 13:32:15.162031 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161401 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cc296f8c-211e-4e05-8959-a5aba129cc83-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.162031 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161476 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d945d3c6-4041-4013-9864-0b2c325ebccb-iptables-alerter-script\") pod \"iptables-alerter-hkkq2\" (UID: \"d945d3c6-4041-4013-9864-0b2c325ebccb\") " pod="openshift-network-operator/iptables-alerter-hkkq2" Apr 23 13:32:15.162031 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161532 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-systemd\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.162031 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161566 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-run-k8s-cni-cncf-io\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.162031 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161597 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a642a633-45bc-405e-899d-b28d88699e93-serviceca\") pod \"node-ca-5lgwg\" (UID: \"a642a633-45bc-405e-899d-b28d88699e93\") " pod="openshift-image-registry/node-ca-5lgwg" Apr 23 13:32:15.162031 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161600 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-multus-conf-dir\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.162031 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.161880 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cc296f8c-211e-4e05-8959-a5aba129cc83-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.162379 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.162321 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d945d3c6-4041-4013-9864-0b2c325ebccb-iptables-alerter-script\") pod \"iptables-alerter-hkkq2\" (UID: \"d945d3c6-4041-4013-9864-0b2c325ebccb\") " pod="openshift-network-operator/iptables-alerter-hkkq2" Apr 23 13:32:15.164603 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.164190 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"agent-certs\" (UniqueName: \"kubernetes.io/secret/226f8750-5ae4-4644-ac04-451c03fc015b-agent-certs\") pod \"konnectivity-agent-58dnr\" (UID: \"226f8750-5ae4-4644-ac04-451c03fc015b\") " pod="kube-system/konnectivity-agent-58dnr" Apr 23 13:32:15.164603 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.164282 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-ovn-node-metrics-cert\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.167416 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.167392 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:15.167416 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.167419 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:15.167583 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.167433 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zw4cx for pod openshift-network-diagnostics/network-check-target-75h8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:15.167583 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.167495 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx podName:a43bd7f9-1505-4e58-acda-ef8e398e302d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:15.667473171 +0000 UTC m=+3.050002301 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zw4cx" (UniqueName: "kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx") pod "network-check-target-75h8j" (UID: "a43bd7f9-1505-4e58-acda-ef8e398e302d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:15.168890 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.168867 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pqj2\" (UniqueName: \"kubernetes.io/projected/f6811f33-1a89-4437-950c-bdb29fcbc2f5-kube-api-access-4pqj2\") pod \"node-resolver-4nhld\" (UID: \"f6811f33-1a89-4437-950c-bdb29fcbc2f5\") " pod="openshift-dns/node-resolver-4nhld" Apr 23 13:32:15.169385 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.169363 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ks5g\" (UniqueName: \"kubernetes.io/projected/b31fd928-68b6-44d9-91ab-8a340c5c1073-kube-api-access-7ks5g\") pod \"aws-ebs-csi-driver-node-xg7tt\" (UID: \"b31fd928-68b6-44d9-91ab-8a340c5c1073\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" Apr 23 13:32:15.169560 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.169535 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8ntr\" (UniqueName: \"kubernetes.io/projected/d945d3c6-4041-4013-9864-0b2c325ebccb-kube-api-access-j8ntr\") pod \"iptables-alerter-hkkq2\" (UID: \"d945d3c6-4041-4013-9864-0b2c325ebccb\") " pod="openshift-network-operator/iptables-alerter-hkkq2" Apr 23 13:32:15.170733 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.170701 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpcwq\" (UniqueName: \"kubernetes.io/projected/cc296f8c-211e-4e05-8959-a5aba129cc83-kube-api-access-tpcwq\") pod \"multus-additional-cni-plugins-w2589\" (UID: \"cc296f8c-211e-4e05-8959-a5aba129cc83\") " pod="openshift-multus/multus-additional-cni-plugins-w2589" Apr 23 13:32:15.171488 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.171467 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b7kx\" (UniqueName: \"kubernetes.io/projected/b5ea6fad-b66f-4dc4-b956-3f7e7185d225-kube-api-access-8b7kx\") pod \"ovnkube-node-kqqbw\" (UID: \"b5ea6fad-b66f-4dc4-b956-3f7e7185d225\") " pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:15.172138 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.172115 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsxv7\" (UniqueName: \"kubernetes.io/projected/a642a633-45bc-405e-899d-b28d88699e93-kube-api-access-vsxv7\") pod \"node-ca-5lgwg\" (UID: \"a642a633-45bc-405e-899d-b28d88699e93\") " pod="openshift-image-registry/node-ca-5lgwg" Apr 23 13:32:15.262611 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.262577 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-kubernetes\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.262797 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.262618 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-sysctl-d\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.262797 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.262644 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmx6b\" (UniqueName: \"kubernetes.io/projected/92424b25-21d9-42cb-aca4-cad86a5e3dad-kube-api-access-jmx6b\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.262797 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.262739 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-kubernetes\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.262960 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.262889 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-sysctl-d\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.262960 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.262948 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-os-release\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263056 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263005 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-etc-kubernetes\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263056 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263047 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-os-release\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263161 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263061 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzmfj\" (UniqueName: \"kubernetes.io/projected/69a4531d-2959-43b5-929f-9d7ddf10163b-kube-api-access-bzmfj\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:15.263161 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263097 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-etc-kubernetes\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263161 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263113 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-run\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263161 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263140 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05de8049-2bce-4d70-bdf3-a72ee2c57e37-tmp\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263353 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263164 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-multus-cni-dir\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263353 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263190 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-var-lib-cni-bin\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263353 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263216 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-run\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263353 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263221 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-run-netns\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263353 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263284 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-var-lib-cni-bin\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263353 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263292 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-multus-cni-dir\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263353 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263292 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-sysconfig\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263353 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263337 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-sysconfig\") pod 
\"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263353 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263296 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-run-netns\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263363 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-cnibin\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263391 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-var-lib-cni-multus\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263432 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-modprobe-d\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263438 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-cnibin\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263457 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-run-multus-certs\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263461 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-var-lib-cni-multus\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263485 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-sysctl-conf\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263509 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-var-lib-kubelet\") pod \"tuned-x9nzd\" (UID: 
\"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263512 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-run-multus-certs\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263532 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-tuned\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263558 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfhjp\" (UniqueName: \"kubernetes.io/projected/05de8049-2bce-4d70-bdf3-a72ee2c57e37-kube-api-access-kfhjp\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263576 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-var-lib-kubelet\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263576 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-modprobe-d\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263581 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/92424b25-21d9-42cb-aca4-cad86a5e3dad-multus-daemon-config\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263631 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/33d32343-1781-48b5-bdcd-a04d2dec36da-kubelet-config\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263648 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-sysctl-conf\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263675 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-systemd\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.263722 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263706 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/33d32343-1781-48b5-bdcd-a04d2dec36da-kubelet-config\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263721 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-systemd\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263755 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-run-k8s-cni-cncf-io\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263795 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-multus-conf-dir\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263832 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-run-k8s-cni-cncf-io\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263840 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-multus-socket-dir-parent\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263906 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-multus-socket-dir-parent\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263897 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-multus-conf-dir\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263948 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.263981 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-system-cni-dir\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264032 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-system-cni-dir\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.264039 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264095 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/92424b25-21d9-42cb-aca4-cad86a5e3dad-cni-binary-copy\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76" Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.264123 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs podName:69a4531d-2959-43b5-929f-9d7ddf10163b nodeName:}" failed. No retries permitted until 2026-04-23 13:32:15.764103524 +0000 UTC m=+3.146632633 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs") pod "network-metrics-daemon-c246c" (UID: "69a4531d-2959-43b5-929f-9d7ddf10163b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264160 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-var-lib-kubelet\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76"
Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264193 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-sys\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd"
Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264252 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-lib-modules\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd"
Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264278 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-host\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd"
Apr 23 13:32:15.264485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264302 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-hostroot\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76"
Apr 23 13:32:15.265092 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264330 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:15.265092 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264354 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/33d32343-1781-48b5-bdcd-a04d2dec36da-dbus\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:15.265092 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264362 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-sys\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd"
Apr 23 13:32:15.265092 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264441 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-host\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd"
Apr 23 13:32:15.265092 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.264457 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:32:15.265092 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264455 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-host-var-lib-kubelet\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76"
Apr 23 13:32:15.265092 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264474 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/92424b25-21d9-42cb-aca4-cad86a5e3dad-hostroot\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76"
Apr 23 13:32:15.265092 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.264523 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret podName:33d32343-1781-48b5-bdcd-a04d2dec36da nodeName:}" failed. No retries permitted until 2026-04-23 13:32:15.764509011 +0000 UTC m=+3.147038122 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret") pod "global-pull-secret-syncer-p54pj" (UID: "33d32343-1781-48b5-bdcd-a04d2dec36da") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:32:15.265092 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264574 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/33d32343-1781-48b5-bdcd-a04d2dec36da-dbus\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:15.265092 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264589 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/05de8049-2bce-4d70-bdf3-a72ee2c57e37-lib-modules\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd"
Apr 23 13:32:15.265092 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/92424b25-21d9-42cb-aca4-cad86a5e3dad-multus-daemon-config\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76"
Apr 23 13:32:15.265092 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.264952 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/92424b25-21d9-42cb-aca4-cad86a5e3dad-cni-binary-copy\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76"
Apr 23 13:32:15.265738 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.265715 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05de8049-2bce-4d70-bdf3-a72ee2c57e37-tmp\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd"
Apr 23 13:32:15.265890 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.265874 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/05de8049-2bce-4d70-bdf3-a72ee2c57e37-etc-tuned\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd"
Apr 23 13:32:15.280167 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.280131 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmx6b\" (UniqueName: \"kubernetes.io/projected/92424b25-21d9-42cb-aca4-cad86a5e3dad-kube-api-access-jmx6b\") pod \"multus-26s76\" (UID: \"92424b25-21d9-42cb-aca4-cad86a5e3dad\") " pod="openshift-multus/multus-26s76"
Apr 23 13:32:15.282048 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.282028 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfhjp\" (UniqueName: \"kubernetes.io/projected/05de8049-2bce-4d70-bdf3-a72ee2c57e37-kube-api-access-kfhjp\") pod \"tuned-x9nzd\" (UID: \"05de8049-2bce-4d70-bdf3-a72ee2c57e37\") " pod="openshift-cluster-node-tuning-operator/tuned-x9nzd"
Apr 23 13:32:15.282207 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.282187 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzmfj\" (UniqueName: \"kubernetes.io/projected/69a4531d-2959-43b5-929f-9d7ddf10163b-kube-api-access-bzmfj\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:32:15.350608 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.350531 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5lgwg"
Apr 23 13:32:15.362274 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.362247 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w2589"
Apr 23 13:32:15.370974 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.370946 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hkkq2"
Apr 23 13:32:15.375594 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.375576 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-58dnr"
Apr 23 13:32:15.381167 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.381149 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw"
Apr 23 13:32:15.387339 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.387320 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4nhld"
Apr 23 13:32:15.393964 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.393946 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt"
Apr 23 13:32:15.400538 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.400519 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x9nzd"
Apr 23 13:32:15.405093 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.405071 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-26s76"
Apr 23 13:32:15.767818 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.767778 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:32:15.768008 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.767828 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:15.768008 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:15.767853 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4cx\" (UniqueName: \"kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx\") pod \"network-check-target-75h8j\" (UID: \"a43bd7f9-1505-4e58-acda-ef8e398e302d\") " pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:32:15.768008 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.767886 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:15.768008 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.767956 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs podName:69a4531d-2959-43b5-929f-9d7ddf10163b nodeName:}" failed. No retries permitted until 2026-04-23 13:32:16.767937213 +0000 UTC m=+4.150466345 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs") pod "network-metrics-daemon-c246c" (UID: "69a4531d-2959-43b5-929f-9d7ddf10163b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:15.768008 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.767975 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:32:15.768008 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.767989 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:15.768008 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.768005 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:15.768287 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.768017 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zw4cx for pod openshift-network-diagnostics/network-check-target-75h8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:15.768287 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.768025 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret podName:33d32343-1781-48b5-bdcd-a04d2dec36da nodeName:}" failed. No retries permitted until 2026-04-23 13:32:16.768010018 +0000 UTC m=+4.150539146 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret") pod "global-pull-secret-syncer-p54pj" (UID: "33d32343-1781-48b5-bdcd-a04d2dec36da") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:32:15.768287 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:15.768057 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx podName:a43bd7f9-1505-4e58-acda-ef8e398e302d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:16.768045381 +0000 UTC m=+4.150574488 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "kube-api-access-zw4cx" (UniqueName: "kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx") pod "network-check-target-75h8j" (UID: "a43bd7f9-1505-4e58-acda-ef8e398e302d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:15.895244 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:15.895194 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc296f8c_211e_4e05_8959_a5aba129cc83.slice/crio-7f49c24414f85f0b486ae820a902ea6e3f12d69b66a42aa6b96315cf1e1b6735 WatchSource:0}: Error finding container 7f49c24414f85f0b486ae820a902ea6e3f12d69b66a42aa6b96315cf1e1b6735: Status 404 returned error can't find the container with id 7f49c24414f85f0b486ae820a902ea6e3f12d69b66a42aa6b96315cf1e1b6735
Apr 23 13:32:15.901989 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:15.901958 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb31fd928_68b6_44d9_91ab_8a340c5c1073.slice/crio-d27d25180a547d7b356f644b01c8e54f8cc06cb30e66fd2e0281b6b456b7f552 WatchSource:0}: Error finding container d27d25180a547d7b356f644b01c8e54f8cc06cb30e66fd2e0281b6b456b7f552: Status 404 returned error can't find the container with id d27d25180a547d7b356f644b01c8e54f8cc06cb30e66fd2e0281b6b456b7f552
Apr 23 13:32:15.903593 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:15.903561 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92424b25_21d9_42cb_aca4_cad86a5e3dad.slice/crio-5e04f27597887fdae5e4378900ccf2c23a9bdce8f65edb3c381f468fc95d0ed9 WatchSource:0}: Error finding container 5e04f27597887fdae5e4378900ccf2c23a9bdce8f65edb3c381f468fc95d0ed9: Status 404 returned error can't find the container with id 5e04f27597887fdae5e4378900ccf2c23a9bdce8f65edb3c381f468fc95d0ed9
Apr 23 13:32:15.903945 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:15.903915 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda642a633_45bc_405e_899d_b28d88699e93.slice/crio-330981ed40591ffc3266b8812c7f0682d6060ddfed8b80149a1184e2062294bb WatchSource:0}: Error finding container 330981ed40591ffc3266b8812c7f0682d6060ddfed8b80149a1184e2062294bb: Status 404 returned error can't find the container with id 330981ed40591ffc3266b8812c7f0682d6060ddfed8b80149a1184e2062294bb
Apr 23 13:32:15.904465 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:15.904439 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5ea6fad_b66f_4dc4_b956_3f7e7185d225.slice/crio-cf0b94992418c1685f20f3255b269c7e7c3f0b35c66c2c0fb8b2c56bc8065b83 WatchSource:0}: Error finding container cf0b94992418c1685f20f3255b269c7e7c3f0b35c66c2c0fb8b2c56bc8065b83: Status 404 returned error can't find the container with id cf0b94992418c1685f20f3255b269c7e7c3f0b35c66c2c0fb8b2c56bc8065b83
Apr 23 13:32:15.905406 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:15.905387 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05de8049_2bce_4d70_bdf3_a72ee2c57e37.slice/crio-d2644fb55e05be8005e73bad99a548ecff8a0e1f9dd210fd85f382b91a235879 WatchSource:0}: Error finding container d2644fb55e05be8005e73bad99a548ecff8a0e1f9dd210fd85f382b91a235879: Status 404 returned error can't find the container with id d2644fb55e05be8005e73bad99a548ecff8a0e1f9dd210fd85f382b91a235879
Apr 23 13:32:15.906687 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:15.906660 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6811f33_1a89_4437_950c_bdb29fcbc2f5.slice/crio-7c95e6009295dc05e01a290b68446a7270860a2226d3f208f63cd05f3dad7809 WatchSource:0}: Error finding container 7c95e6009295dc05e01a290b68446a7270860a2226d3f208f63cd05f3dad7809: Status 404 returned error can't find the container with id 7c95e6009295dc05e01a290b68446a7270860a2226d3f208f63cd05f3dad7809
Apr 23 13:32:15.908367 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:15.908345 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod226f8750_5ae4_4644_ac04_451c03fc015b.slice/crio-0c9d71e2bd1a56a0527774d12162ad1a2bdca745cf3b56517cf94c443f313d3e WatchSource:0}: Error finding container 0c9d71e2bd1a56a0527774d12162ad1a2bdca745cf3b56517cf94c443f313d3e: Status 404 returned error can't find the container with id 0c9d71e2bd1a56a0527774d12162ad1a2bdca745cf3b56517cf94c443f313d3e
Apr 23 13:32:16.143053 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.143012 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:27:14 +0000 UTC" deadline="2027-12-11 19:21:44.801852244 +0000 UTC"
Apr 23 13:32:16.143053 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.143046 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14333h49m28.658809575s"
Apr 23 13:32:16.179853 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.179712 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:16.212938 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.212895 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" event={"ID":"b5ea6fad-b66f-4dc4-b956-3f7e7185d225","Type":"ContainerStarted","Data":"cf0b94992418c1685f20f3255b269c7e7c3f0b35c66c2c0fb8b2c56bc8065b83"}
Apr 23 13:32:16.215143 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.215112 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-176.ec2.internal" event={"ID":"829c0bd398638defff6c06e99f548781","Type":"ContainerStarted","Data":"7fcdfbbcfd070ec03747b723df15a0322d83fb8ca6163b3fde8825411753484b"}
Apr 23 13:32:16.216690 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.216660 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-58dnr" event={"ID":"226f8750-5ae4-4644-ac04-451c03fc015b","Type":"ContainerStarted","Data":"0c9d71e2bd1a56a0527774d12162ad1a2bdca745cf3b56517cf94c443f313d3e"}
Apr 23 13:32:16.218349 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.218323 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" event={"ID":"05de8049-2bce-4d70-bdf3-a72ee2c57e37","Type":"ContainerStarted","Data":"d2644fb55e05be8005e73bad99a548ecff8a0e1f9dd210fd85f382b91a235879"}
Apr 23 13:32:16.219869 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.219841 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-26s76" event={"ID":"92424b25-21d9-42cb-aca4-cad86a5e3dad","Type":"ContainerStarted","Data":"5e04f27597887fdae5e4378900ccf2c23a9bdce8f65edb3c381f468fc95d0ed9"}
Apr 23 13:32:16.221074 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.221046 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" event={"ID":"b31fd928-68b6-44d9-91ab-8a340c5c1073","Type":"ContainerStarted","Data":"d27d25180a547d7b356f644b01c8e54f8cc06cb30e66fd2e0281b6b456b7f552"}
Apr 23 13:32:16.222164 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.222126 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hkkq2" event={"ID":"d945d3c6-4041-4013-9864-0b2c325ebccb","Type":"ContainerStarted","Data":"21365cd131a8dffc26507f7ed5769ced18121b7f4a21b8b788cbdf6ec7e76767"}
Apr 23 13:32:16.223282 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.223218 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2589" event={"ID":"cc296f8c-211e-4e05-8959-a5aba129cc83","Type":"ContainerStarted","Data":"7f49c24414f85f0b486ae820a902ea6e3f12d69b66a42aa6b96315cf1e1b6735"}
Apr 23 13:32:16.224377 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.224349 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4nhld" event={"ID":"f6811f33-1a89-4437-950c-bdb29fcbc2f5","Type":"ContainerStarted","Data":"7c95e6009295dc05e01a290b68446a7270860a2226d3f208f63cd05f3dad7809"}
Apr 23 13:32:16.225791 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.225650 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5lgwg" event={"ID":"a642a633-45bc-405e-899d-b28d88699e93","Type":"ContainerStarted","Data":"330981ed40591ffc3266b8812c7f0682d6060ddfed8b80149a1184e2062294bb"}
Apr 23 13:32:16.776108 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.776068 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:32:16.776296 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.776130 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:16.776296 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:16.776163 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4cx\" (UniqueName: \"kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx\") pod \"network-check-target-75h8j\" (UID: \"a43bd7f9-1505-4e58-acda-ef8e398e302d\") " pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:32:16.776427 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:16.776339 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:16.776427 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:16.776359 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:16.776427 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:16.776373 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zw4cx for pod openshift-network-diagnostics/network-check-target-75h8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:16.776569 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:16.776431 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx podName:a43bd7f9-1505-4e58-acda-ef8e398e302d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:18.776412991 +0000 UTC m=+6.158942113 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zw4cx" (UniqueName: "kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx") pod "network-check-target-75h8j" (UID: "a43bd7f9-1505-4e58-acda-ef8e398e302d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:16.776852 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:16.776834 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:16.776923 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:16.776886 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs podName:69a4531d-2959-43b5-929f-9d7ddf10163b nodeName:}" failed. No retries permitted until 2026-04-23 13:32:18.776871186 +0000 UTC m=+6.159400302 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs") pod "network-metrics-daemon-c246c" (UID: "69a4531d-2959-43b5-929f-9d7ddf10163b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:16.776987 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:16.776947 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:32:16.776987 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:16.776980 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret podName:33d32343-1781-48b5-bdcd-a04d2dec36da nodeName:}" failed. No retries permitted until 2026-04-23 13:32:18.776969546 +0000 UTC m=+6.159498657 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret") pod "global-pull-secret-syncer-p54pj" (UID: "33d32343-1781-48b5-bdcd-a04d2dec36da") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:32:17.208901 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:17.208010 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:32:17.208901 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:17.208149 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b"
Apr 23 13:32:17.208901 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:17.208565 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:17.208901 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:17.208664 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da"
Apr 23 13:32:17.208901 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:17.208738 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:32:17.208901 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:17.208808 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d"
Apr 23 13:32:17.231544 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:17.231509 2567 generic.go:358] "Generic (PLEG): container finished" podID="0c43c6c4a8456b6c709608d0d7c51f49" containerID="28acc585b6aff3e5486bc847ff36268e6bb2b21f7463ce51e230e60b810a6e56" exitCode=0
Apr 23 13:32:17.232303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:17.232279 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" event={"ID":"0c43c6c4a8456b6c709608d0d7c51f49","Type":"ContainerDied","Data":"28acc585b6aff3e5486bc847ff36268e6bb2b21f7463ce51e230e60b810a6e56"}
Apr 23 13:32:17.258113 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:17.258049 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-176.ec2.internal" podStartSLOduration=3.258030902 podStartE2EDuration="3.258030902s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:16.232478893 +0000 UTC m=+3.615008027" watchObservedRunningTime="2026-04-23 13:32:17.258030902 +0000 UTC m=+4.640560034"
Apr 23 13:32:18.240476 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:18.240437 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" event={"ID":"0c43c6c4a8456b6c709608d0d7c51f49","Type":"ContainerStarted","Data":"0d368460bc8c7766ca61c9c6f74a9da16acf676ecc7b6983cc507d914fd8dbda"}
Apr 23 13:32:18.257075 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:18.256097 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-176.ec2.internal" podStartSLOduration=4.256076778 podStartE2EDuration="4.256076778s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:18.256016637 +0000 UTC m=+5.638545768" watchObservedRunningTime="2026-04-23 13:32:18.256076778 +0000 UTC m=+5.638605912"
Apr 23 13:32:18.793413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:18.793369 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4cx\" (UniqueName: \"kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx\") pod \"network-check-target-75h8j\" (UID: \"a43bd7f9-1505-4e58-acda-ef8e398e302d\") " pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:32:18.793584 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:18.793465 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:32:18.793584 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:18.793507 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:18.793715 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:18.793632 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:32:18.793715 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:18.793698 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret podName:33d32343-1781-48b5-bdcd-a04d2dec36da nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.793679897 +0000 UTC m=+10.176209028 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret") pod "global-pull-secret-syncer-p54pj" (UID: "33d32343-1781-48b5-bdcd-a04d2dec36da") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:32:18.794122 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:18.794099 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:18.794209 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:18.794129 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:18.794209 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:18.794141 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zw4cx for pod openshift-network-diagnostics/network-check-target-75h8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:18.794209 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:18.794188 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx podName:a43bd7f9-1505-4e58-acda-ef8e398e302d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.794172754 +0000 UTC m=+10.176701867 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zw4cx" (UniqueName: "kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx") pod "network-check-target-75h8j" (UID: "a43bd7f9-1505-4e58-acda-ef8e398e302d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:18.794399 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:18.794272 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:18.794399 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:18.794313 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs podName:69a4531d-2959-43b5-929f-9d7ddf10163b nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.794299588 +0000 UTC m=+10.176828700 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs") pod "network-metrics-daemon-c246c" (UID: "69a4531d-2959-43b5-929f-9d7ddf10163b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:19.205398 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:19.205367 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:32:19.205602 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:19.205500 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b"
Apr 23 13:32:19.206114 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:19.205367 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:32:19.206114 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:19.205861 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:19.206114 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:19.205943 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d"
Apr 23 13:32:19.206114 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:19.206027 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da"
Apr 23 13:32:21.211788 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:21.211378 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:32:21.211788 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:21.211409 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:21.211788 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:21.211499 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d"
Apr 23 13:32:21.211788 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:21.211589 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:32:21.211788 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:21.211677 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b"
Apr 23 13:32:21.211788 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:21.211743 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da"
Apr 23 13:32:22.830152 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:22.830113 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:32:22.830560 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:22.830173 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:22.830560 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:22.830206 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4cx\" (UniqueName: \"kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx\") pod \"network-check-target-75h8j\" (UID: \"a43bd7f9-1505-4e58-acda-ef8e398e302d\") " pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:32:22.830560 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:22.830273 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:22.830560 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:22.830338 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs podName:69a4531d-2959-43b5-929f-9d7ddf10163b nodeName:}" failed. No retries permitted until 2026-04-23 13:32:30.830320975 +0000 UTC m=+18.212850086 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs") pod "network-metrics-daemon-c246c" (UID: "69a4531d-2959-43b5-929f-9d7ddf10163b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:22.830560 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:22.830339 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:22.830560 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:22.830357 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:22.830560 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:22.830368 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zw4cx for pod openshift-network-diagnostics/network-check-target-75h8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:22.830560 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:22.830367 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:32:22.830560 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:22.830400 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx podName:a43bd7f9-1505-4e58-acda-ef8e398e302d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:30.830390361 +0000 UTC m=+18.212919482 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zw4cx" (UniqueName: "kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx") pod "network-check-target-75h8j" (UID: "a43bd7f9-1505-4e58-acda-ef8e398e302d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:22.830560 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:22.830466 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret podName:33d32343-1781-48b5-bdcd-a04d2dec36da nodeName:}" failed. No retries permitted until 2026-04-23 13:32:30.83044613 +0000 UTC m=+18.212975257 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret") pod "global-pull-secret-syncer-p54pj" (UID: "33d32343-1781-48b5-bdcd-a04d2dec36da") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:32:23.206697 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:23.206117 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:32:23.206697 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:23.206248 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d"
Apr 23 13:32:23.206697 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:23.206599 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:32:23.206984 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:23.206711 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b"
Apr 23 13:32:23.206984 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:23.206754 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:23.206984 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:23.206822 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da"
Apr 23 13:32:25.205932 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:25.205846 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:25.205932 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:25.205872 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:32:25.205932 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:25.205888 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:32:25.206497 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:25.205976 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da"
Apr 23 13:32:25.206497 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:25.206467 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b"
Apr 23 13:32:25.206578 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:25.206549 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d"
Apr 23 13:32:27.205577 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:27.205540 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:27.206111 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:27.205540 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:32:27.206111 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:27.205673 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da"
Apr 23 13:32:27.206111 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:27.205540 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:32:27.206111 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:27.205784 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b"
Apr 23 13:32:27.206111 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:27.205852 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d"
Apr 23 13:32:29.205528 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:29.205465 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:32:29.205968 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:29.205465 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj"
Apr 23 13:32:29.205968 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:29.205626 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:32:29.205968 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:29.205626 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b" Apr 23 13:32:29.205968 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:29.205750 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da" Apr 23 13:32:29.205968 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:29.205848 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d" Apr 23 13:32:30.888921 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:30.888884 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4cx\" (UniqueName: \"kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx\") pod \"network-check-target-75h8j\" (UID: \"a43bd7f9-1505-4e58-acda-ef8e398e302d\") " pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:30.889358 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:30.888952 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:30.889358 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:30.888986 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:30.889358 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:30.889057 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:30.889358 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:30.889073 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:30.889358 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:30.889083 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zw4cx for pod openshift-network-diagnostics/network-check-target-75h8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:30.889358 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:30.889117 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:30.889358 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:30.889139 
2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx podName:a43bd7f9-1505-4e58-acda-ef8e398e302d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:46.889121491 +0000 UTC m=+34.271650616 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zw4cx" (UniqueName: "kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx") pod "network-check-target-75h8j" (UID: "a43bd7f9-1505-4e58-acda-ef8e398e302d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:30.889358 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:30.889153 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:30.889358 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:30.889184 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs podName:69a4531d-2959-43b5-929f-9d7ddf10163b nodeName:}" failed. No retries permitted until 2026-04-23 13:32:46.889164501 +0000 UTC m=+34.271693613 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs") pod "network-metrics-daemon-c246c" (UID: "69a4531d-2959-43b5-929f-9d7ddf10163b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:30.889358 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:30.889202 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret podName:33d32343-1781-48b5-bdcd-a04d2dec36da nodeName:}" failed. No retries permitted until 2026-04-23 13:32:46.889192083 +0000 UTC m=+34.271721192 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret") pod "global-pull-secret-syncer-p54pj" (UID: "33d32343-1781-48b5-bdcd-a04d2dec36da") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:32:31.205881 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:31.205793 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:31.206037 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:31.205792 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:31.206037 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:31.205942 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b" Apr 23 13:32:31.206037 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:31.205792 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:31.206037 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:31.206023 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da" Apr 23 13:32:31.206248 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:31.206066 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d" Apr 23 13:32:33.208005 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:33.206807 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:33.208005 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:33.206940 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d" Apr 23 13:32:33.208005 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:33.207462 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:33.208005 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:33.207571 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b" Apr 23 13:32:33.208005 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:33.207720 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:33.208005 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:33.207806 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da" Apr 23 13:32:33.274583 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:33.274336 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" event={"ID":"05de8049-2bce-4d70-bdf3-a72ee2c57e37","Type":"ContainerStarted","Data":"0ca39caa72674703012f096d3977120ff73bbafb4f12bdfa692b54430e877cbd"} Apr 23 13:32:33.283944 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:33.283913 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-26s76" event={"ID":"92424b25-21d9-42cb-aca4-cad86a5e3dad","Type":"ContainerStarted","Data":"66dfc5f649096a93c4b57cc87fa5ecc6d7a40c7c4bdbd947b044a462f205c56c"} Apr 23 13:32:33.297132 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:33.297076 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-x9nzd" podStartSLOduration=3.198292044 podStartE2EDuration="20.297058121s" podCreationTimestamp="2026-04-23 13:32:13 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.907292923 +0000 UTC m=+3.289822041" lastFinishedPulling="2026-04-23 13:32:33.006058996 +0000 UTC m=+20.388588118" observedRunningTime="2026-04-23 13:32:33.295199583 +0000 UTC m=+20.677728713" watchObservedRunningTime="2026-04-23 13:32:33.297058121 +0000 UTC m=+20.679587253" Apr 23 13:32:34.200683 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.200527 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 13:32:34.287497 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.287462 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4nhld" event={"ID":"f6811f33-1a89-4437-950c-bdb29fcbc2f5","Type":"ContainerStarted","Data":"c4cd790f252ecd4544dd2ac0fbbe1a820ac9f465839c4e6d3817484fc39d0f33"} Apr 23 13:32:34.288813 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.288788 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5lgwg" event={"ID":"a642a633-45bc-405e-899d-b28d88699e93","Type":"ContainerStarted","Data":"d1c36591d73cdd393eef21d6cf524de729e5ea198a56f76077b83ed2a601ec51"} Apr 23 13:32:34.290980 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.290961 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:32:34.291303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.291284 2567 generic.go:358] "Generic (PLEG): container finished" podID="b5ea6fad-b66f-4dc4-b956-3f7e7185d225" containerID="e5e68ca0975c61c7a250c88710b29dc651132902f99319416af14649f38df13b" exitCode=1 Apr 23 13:32:34.291379 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.291339 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" event={"ID":"b5ea6fad-b66f-4dc4-b956-3f7e7185d225","Type":"ContainerStarted","Data":"5013885dada6607aee4258ba6d937ef553821790703b9e2061a2fb15e5672130"} Apr 23 13:32:34.291379 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.291358 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" event={"ID":"b5ea6fad-b66f-4dc4-b956-3f7e7185d225","Type":"ContainerStarted","Data":"f82209863465c490b6561387326f32f241ead32641f80ef58fe23b0d551c9972"} Apr 23 13:32:34.291379 
ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.291372 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" event={"ID":"b5ea6fad-b66f-4dc4-b956-3f7e7185d225","Type":"ContainerStarted","Data":"c55fa17a6fdb7cd730e926994e82f99361492b777e0f3d6d833fc6baecf9955d"} Apr 23 13:32:34.291502 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.291382 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" event={"ID":"b5ea6fad-b66f-4dc4-b956-3f7e7185d225","Type":"ContainerStarted","Data":"98dbadee411862278cd4a0b2525f0fc352668487b355bd2526c57d3767fdd1b3"} Apr 23 13:32:34.291502 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.291396 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" event={"ID":"b5ea6fad-b66f-4dc4-b956-3f7e7185d225","Type":"ContainerStarted","Data":"07326802522adb4f44d6a4ca7435fd47a54e5713ddc05e76dcb9435ece6425ee"} Apr 23 13:32:34.291502 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.291404 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" event={"ID":"b5ea6fad-b66f-4dc4-b956-3f7e7185d225","Type":"ContainerDied","Data":"e5e68ca0975c61c7a250c88710b29dc651132902f99319416af14649f38df13b"} Apr 23 13:32:34.292574 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.292550 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-58dnr" event={"ID":"226f8750-5ae4-4644-ac04-451c03fc015b","Type":"ContainerStarted","Data":"2a981f570ef0bd29d6d4ecf7db2a35bc0dc6a7dcfb7b60ea137fa91ba6198677"} Apr 23 13:32:34.294023 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.293995 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" event={"ID":"b31fd928-68b6-44d9-91ab-8a340c5c1073","Type":"ContainerStarted","Data":"75f7f09bed7caae4096c5e1eece4d100bee676da9423d7304098a16933fedc1c"} Apr 23 13:32:34.294023 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.294020 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" event={"ID":"b31fd928-68b6-44d9-91ab-8a340c5c1073","Type":"ContainerStarted","Data":"9556437d8e91460b77af144c3398773b0501ac6658538150cd9663ff9a0e719f"} Apr 23 13:32:34.295078 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.295044 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hkkq2" event={"ID":"d945d3c6-4041-4013-9864-0b2c325ebccb","Type":"ContainerStarted","Data":"193930e201400da9eaa6fe13f1b718bc21c6cd349c2cebc91d77be8ad969371d"} Apr 23 13:32:34.296349 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.296330 2567 generic.go:358] "Generic (PLEG): container finished" podID="cc296f8c-211e-4e05-8959-a5aba129cc83" containerID="7b3c1e5ff0a00ed8e41aebf6b96ba283113ecd180d862d27a2f19626e78db609" exitCode=0 Apr 23 13:32:34.296432 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.296415 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2589" event={"ID":"cc296f8c-211e-4e05-8959-a5aba129cc83","Type":"ContainerDied","Data":"7b3c1e5ff0a00ed8e41aebf6b96ba283113ecd180d862d27a2f19626e78db609"} Apr 23 13:32:34.307870 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.307829 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4nhld" 
podStartSLOduration=4.166003165 podStartE2EDuration="21.307817513s" podCreationTimestamp="2026-04-23 13:32:13 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.909142005 +0000 UTC m=+3.291671120" lastFinishedPulling="2026-04-23 13:32:33.050956346 +0000 UTC m=+20.433485468" observedRunningTime="2026-04-23 13:32:34.307361011 +0000 UTC m=+21.689890141" watchObservedRunningTime="2026-04-23 13:32:34.307817513 +0000 UTC m=+21.690346642" Apr 23 13:32:34.307990 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.307926 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-26s76" podStartSLOduration=4.137523208 podStartE2EDuration="21.307921997s" podCreationTimestamp="2026-04-23 13:32:13 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.905982674 +0000 UTC m=+3.288511782" lastFinishedPulling="2026-04-23 13:32:33.076381459 +0000 UTC m=+20.458910571" observedRunningTime="2026-04-23 13:32:33.313204225 +0000 UTC m=+20.695733355" watchObservedRunningTime="2026-04-23 13:32:34.307921997 +0000 UTC m=+21.690451154" Apr 23 13:32:34.369900 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.369860 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hkkq2" podStartSLOduration=4.264528026 podStartE2EDuration="21.369846615s" podCreationTimestamp="2026-04-23 13:32:13 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.900755187 +0000 UTC m=+3.283284301" lastFinishedPulling="2026-04-23 13:32:33.006073767 +0000 UTC m=+20.388602890" observedRunningTime="2026-04-23 13:32:34.349270204 +0000 UTC m=+21.731799333" watchObservedRunningTime="2026-04-23 13:32:34.369846615 +0000 UTC m=+21.752375744" Apr 23 13:32:34.370391 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.370368 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-58dnr" podStartSLOduration=4.273944662 podStartE2EDuration="21.370360293s" podCreationTimestamp="2026-04-23 13:32:13 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.909656643 +0000 UTC m=+3.292185751" lastFinishedPulling="2026-04-23 13:32:33.006072274 +0000 UTC m=+20.388601382" observedRunningTime="2026-04-23 13:32:34.369566813 +0000 UTC m=+21.752095943" watchObservedRunningTime="2026-04-23 13:32:34.370360293 +0000 UTC m=+21.752889422" Apr 23 13:32:34.390828 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:34.390787 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5lgwg" podStartSLOduration=4.290563771 podStartE2EDuration="21.390774136s" podCreationTimestamp="2026-04-23 13:32:13 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.905836906 +0000 UTC m=+3.288366029" lastFinishedPulling="2026-04-23 13:32:33.006047272 +0000 UTC m=+20.388576394" observedRunningTime="2026-04-23 13:32:34.390296503 +0000 UTC m=+21.772825632" watchObservedRunningTime="2026-04-23 13:32:34.390774136 +0000 UTC m=+21.773303265" Apr 23 13:32:35.124060 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:35.123962 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T13:32:34.200679893Z","UUID":"f2436bb1-9fc1-495f-adec-693764b125b8","Handler":null,"Name":"","Endpoint":""} Apr 23 13:32:35.127112 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:35.127089 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 13:32:35.127112 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:35.127118 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 13:32:35.206101 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:35.206007 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:35.206101 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:35.206043 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:35.206101 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:35.206058 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:35.206392 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:35.206169 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b" Apr 23 13:32:35.206392 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:35.206288 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da" Apr 23 13:32:35.206392 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:35.206344 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d" Apr 23 13:32:35.300279 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:35.300245 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" event={"ID":"b31fd928-68b6-44d9-91ab-8a340c5c1073","Type":"ContainerStarted","Data":"6ad932465830fcb008c84ac701c9075a929791fae21e2c9cd9c56bf18de3ea39"} Apr 23 13:32:35.319407 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:35.319357 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xg7tt" podStartSLOduration=3.303948473 podStartE2EDuration="22.319339534s" podCreationTimestamp="2026-04-23 13:32:13 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.903830593 +0000 UTC m=+3.286359716" lastFinishedPulling="2026-04-23 13:32:34.919221655 +0000 UTC m=+22.301750777" observedRunningTime="2026-04-23 13:32:35.319268444 +0000 UTC m=+22.701797575" watchObservedRunningTime="2026-04-23 13:32:35.319339534 +0000 UTC m=+22.701868665" Apr 23 13:32:35.666456 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:35.666420 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-58dnr" Apr 23 13:32:36.305637 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:36.305370 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:32:36.306088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:36.305988 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" event={"ID":"b5ea6fad-b66f-4dc4-b956-3f7e7185d225","Type":"ContainerStarted","Data":"2f90c1af06394e1f1947bafa2e752d61cc80e068b28f89120fcbec1d460d8857"} Apr 23 13:32:37.205654 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:37.205613 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:37.205877 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:37.205619 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:37.205877 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:37.205731 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d" Apr 23 13:32:37.205877 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:37.205619 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:37.205877 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:37.205827 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b" Apr 23 13:32:37.206061 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:37.205921 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da" Apr 23 13:32:38.312699 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:38.312576 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:32:38.313266 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:38.313083 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" event={"ID":"b5ea6fad-b66f-4dc4-b956-3f7e7185d225","Type":"ContainerStarted","Data":"faf2ff1d147a476cfe9736edcf82d758acc2b6bc97481b30569da974631872e3"} Apr 23 13:32:38.314796 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:38.314754 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:38.314796 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:38.314795 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:38.314932 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:38.314815 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:38.314932 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:38.314793 2567 scope.go:117] "RemoveContainer" containerID="e5e68ca0975c61c7a250c88710b29dc651132902f99319416af14649f38df13b" Apr 23 13:32:38.331148 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:38.331121 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:38.332662 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:38.332640 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" Apr 23 13:32:38.776917 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:38.776824 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-58dnr" Apr 23 13:32:38.777540 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:38.777522 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-58dnr" Apr 23 13:32:39.205921 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:39.205887 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:39.206065 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:39.205892 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:39.206065 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:39.205995 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d" Apr 23 13:32:39.206156 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:39.206094 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da" Apr 23 13:32:39.206156 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:39.205900 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:39.206214 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:39.206204 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b" Apr 23 13:32:39.317664 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:39.317634 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:32:39.318071 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:39.317975 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" event={"ID":"b5ea6fad-b66f-4dc4-b956-3f7e7185d225","Type":"ContainerStarted","Data":"cab4f79f553d5432b8f8986255e1be3ec9c3f6c4954b6c3c3df5aa9f033df2f9"} Apr 23 13:32:39.319593 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:39.319566 2567 generic.go:358] "Generic (PLEG): container finished" podID="cc296f8c-211e-4e05-8959-a5aba129cc83" containerID="faadda7295620aa35542eee192659edc77971e035bcb917a9f74b76532d9b53f" exitCode=0 Apr 23 13:32:39.319699 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:39.319647 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2589" event={"ID":"cc296f8c-211e-4e05-8959-a5aba129cc83","Type":"ContainerDied","Data":"faadda7295620aa35542eee192659edc77971e035bcb917a9f74b76532d9b53f"} Apr 23 13:32:39.320650 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:39.320284 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-58dnr" Apr 23 13:32:39.361394 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:39.361343 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw" podStartSLOduration=9.168199238 podStartE2EDuration="26.361329836s" podCreationTimestamp="2026-04-23 13:32:13 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.906209423 +0000 UTC m=+3.288738531" 
lastFinishedPulling="2026-04-23 13:32:33.09934 +0000 UTC m=+20.481869129" observedRunningTime="2026-04-23 13:32:39.359477602 +0000 UTC m=+26.742006733" watchObservedRunningTime="2026-04-23 13:32:39.361329836 +0000 UTC m=+26.743858966" Apr 23 13:32:40.181582 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:40.181552 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-75h8j"] Apr 23 13:32:40.181749 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:40.181665 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:40.181824 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:40.181777 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d" Apr 23 13:32:40.186244 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:40.186201 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-p54pj"] Apr 23 13:32:40.186375 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:40.186309 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:40.186434 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:40.186390 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da" Apr 23 13:32:40.186750 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:40.186730 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c246c"] Apr 23 13:32:40.186856 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:40.186817 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:40.186934 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:40.186911 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b" Apr 23 13:32:41.206066 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:41.205874 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:41.206445 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:41.206164 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d" Apr 23 13:32:41.325417 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:41.325386 2567 generic.go:358] "Generic (PLEG): container finished" podID="cc296f8c-211e-4e05-8959-a5aba129cc83" containerID="1a0272349021fa1ca709d492a66e1c03183b0d1539202ef384ef9811a2164710" exitCode=0 Apr 23 13:32:41.325551 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:41.325440 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2589" event={"ID":"cc296f8c-211e-4e05-8959-a5aba129cc83","Type":"ContainerDied","Data":"1a0272349021fa1ca709d492a66e1c03183b0d1539202ef384ef9811a2164710"} Apr 23 13:32:42.205746 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:42.205678 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:42.205882 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:42.205784 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da" Apr 23 13:32:42.205882 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:42.205839 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:42.205950 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:42.205938 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b" Apr 23 13:32:43.206942 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:43.206901 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:43.207648 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:43.207009 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d" Apr 23 13:32:43.331096 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:43.331054 2567 generic.go:358] "Generic (PLEG): container finished" podID="cc296f8c-211e-4e05-8959-a5aba129cc83" containerID="16e09146efb6e921a26d264903841734983cb3489bf8ad6cffb300b198ac8912" exitCode=0 Apr 23 13:32:43.331096 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:43.331101 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2589" event={"ID":"cc296f8c-211e-4e05-8959-a5aba129cc83","Type":"ContainerDied","Data":"16e09146efb6e921a26d264903841734983cb3489bf8ad6cffb300b198ac8912"} Apr 23 13:32:44.205511 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:44.205477 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:44.205708 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:44.205486 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:44.205708 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:44.205587 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p54pj" podUID="33d32343-1781-48b5-bdcd-a04d2dec36da" Apr 23 13:32:44.205708 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:44.205673 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b" Apr 23 13:32:45.206209 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:45.206177 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:45.206672 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:45.206315 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-75h8j" podUID="a43bd7f9-1505-4e58-acda-ef8e398e302d" Apr 23 13:32:45.913073 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:45.913038 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-176.ec2.internal" event="NodeReady" Apr 23 13:32:45.913276 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:45.913182 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 13:32:45.962116 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:45.962079 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2hdsq"] Apr 23 13:32:45.979493 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:45.979462 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rhpd9"] Apr 23 13:32:45.979721 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:45.979681 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:45.985027 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:45.984993 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 13:32:45.985282 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:45.985264 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-svh2p\"" Apr 23 13:32:45.985772 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:45.985754 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 13:32:45.998303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:45.998243 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2hdsq"] Apr 23 13:32:45.998303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:45.998290 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rhpd9"] Apr 23 13:32:45.998303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:45.998294 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rhpd9" Apr 23 13:32:46.002080 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.002051 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 13:32:46.002603 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.002382 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 13:32:46.002603 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.002391 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cqcw8\"" Apr 23 13:32:46.002603 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.002496 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 13:32:46.103809 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.103773 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4pbq\" (UniqueName: \"kubernetes.io/projected/441292d6-55dd-4164-b5f7-2bdf45288757-kube-api-access-n4pbq\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9" Apr 23 13:32:46.103994 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.103823 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b86f405-4871-42b3-aa86-bde954086fa9-tmp-dir\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:46.103994 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.103873 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:46.103994 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.103902 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9" Apr 23 13:32:46.104126 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.103997 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m842m\" (UniqueName: \"kubernetes.io/projected/7b86f405-4871-42b3-aa86-bde954086fa9-kube-api-access-m842m\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:46.104126 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.104053 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b86f405-4871-42b3-aa86-bde954086fa9-config-volume\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:46.205380 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.205313 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:46.205380 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.205331 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:46.205627 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.205320 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4pbq\" (UniqueName: \"kubernetes.io/projected/441292d6-55dd-4164-b5f7-2bdf45288757-kube-api-access-n4pbq\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9" Apr 23 13:32:46.205627 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.205476 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b86f405-4871-42b3-aa86-bde954086fa9-tmp-dir\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:46.205627 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.205525 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:46.205627 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.205555 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9" Apr 23 13:32:46.205813 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.205719 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:46.205813 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.205718 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m842m\" (UniqueName: \"kubernetes.io/projected/7b86f405-4871-42b3-aa86-bde954086fa9-kube-api-access-m842m\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:46.206203 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.206176 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls podName:7b86f405-4871-42b3-aa86-bde954086fa9 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:46.706149247 +0000 UTC m=+34.088678374 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls") pod "dns-default-2hdsq" (UID: "7b86f405-4871-42b3-aa86-bde954086fa9") : secret "dns-default-metrics-tls" not found Apr 23 13:32:46.206359 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.206212 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b86f405-4871-42b3-aa86-bde954086fa9-config-volume\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:46.206807 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.206455 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:46.206807 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.206591 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert podName:441292d6-55dd-4164-b5f7-2bdf45288757 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:46.706563164 +0000 UTC m=+34.089092286 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert") pod "ingress-canary-rhpd9" (UID: "441292d6-55dd-4164-b5f7-2bdf45288757") : secret "canary-serving-cert" not found Apr 23 13:32:46.209104 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.207131 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b86f405-4871-42b3-aa86-bde954086fa9-config-volume\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:46.209104 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.208298 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 13:32:46.209104 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.208436 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 13:32:46.209104 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.208305 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lj9pj\"" Apr 23 13:32:46.215750 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.215726 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b86f405-4871-42b3-aa86-bde954086fa9-tmp-dir\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:46.215989 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.215972 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4pbq\" (UniqueName: \"kubernetes.io/projected/441292d6-55dd-4164-b5f7-2bdf45288757-kube-api-access-n4pbq\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9" Apr 23 13:32:46.216035 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.215982 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m842m\" (UniqueName: \"kubernetes.io/projected/7b86f405-4871-42b3-aa86-bde954086fa9-kube-api-access-m842m\") 
pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:46.710563 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.710520 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:46.710563 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.710568 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9" Apr 23 13:32:46.710831 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.710671 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:46.710831 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.710748 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls podName:7b86f405-4871-42b3-aa86-bde954086fa9 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:47.710732222 +0000 UTC m=+35.093261330 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls") pod "dns-default-2hdsq" (UID: "7b86f405-4871-42b3-aa86-bde954086fa9") : secret "dns-default-metrics-tls" not found Apr 23 13:32:46.710831 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.710751 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:46.710831 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.710801 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert podName:441292d6-55dd-4164-b5f7-2bdf45288757 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:47.710786959 +0000 UTC m=+35.093316067 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert") pod "ingress-canary-rhpd9" (UID: "441292d6-55dd-4164-b5f7-2bdf45288757") : secret "canary-serving-cert" not found Apr 23 13:32:46.912291 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.912253 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:32:46.912456 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.912300 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:46.912456 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.912350 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4cx\" (UniqueName: \"kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx\") pod \"network-check-target-75h8j\" (UID: \"a43bd7f9-1505-4e58-acda-ef8e398e302d\") " pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:46.912456 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.912409 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 13:32:46.912628 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.912469 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:46.912628 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.912487 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:46.912628 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.912501 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zw4cx for pod openshift-network-diagnostics/network-check-target-75h8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:46.912628 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.912487 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs podName:69a4531d-2959-43b5-929f-9d7ddf10163b nodeName:}" failed. No retries permitted until 2026-04-23 13:33:18.912464829 +0000 UTC m=+66.294993943 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs") pod "network-metrics-daemon-c246c" (UID: "69a4531d-2959-43b5-929f-9d7ddf10163b") : secret "metrics-daemon-secret" not found Apr 23 13:32:46.912628 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:46.912573 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx podName:a43bd7f9-1505-4e58-acda-ef8e398e302d nodeName:}" failed. No retries permitted until 2026-04-23 13:33:18.912548311 +0000 UTC m=+66.295077436 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zw4cx" (UniqueName: "kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx") pod "network-check-target-75h8j" (UID: "a43bd7f9-1505-4e58-acda-ef8e398e302d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:46.914961 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:46.914938 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/33d32343-1781-48b5-bdcd-a04d2dec36da-original-pull-secret\") pod \"global-pull-secret-syncer-p54pj\" (UID: \"33d32343-1781-48b5-bdcd-a04d2dec36da\") " pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:47.131675 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:47.131634 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p54pj" Apr 23 13:32:47.206151 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:47.206114 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:32:47.211789 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:47.211626 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jcmp5\"" Apr 23 13:32:47.211789 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:47.211661 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 13:32:47.211789 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:47.211690 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 13:32:47.719752 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:47.719717 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:47.719752 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:47.719759 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9" Apr 23 13:32:47.719998 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:47.719871 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:47.719998 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:47.719889 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:47.719998 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:47.719951 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls podName:7b86f405-4871-42b3-aa86-bde954086fa9 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:49.719934187 +0000 UTC m=+37.102463298 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls") pod "dns-default-2hdsq" (UID: "7b86f405-4871-42b3-aa86-bde954086fa9") : secret "dns-default-metrics-tls" not found Apr 23 13:32:47.719998 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:47.719968 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert podName:441292d6-55dd-4164-b5f7-2bdf45288757 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:49.71996234 +0000 UTC m=+37.102491447 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert") pod "ingress-canary-rhpd9" (UID: "441292d6-55dd-4164-b5f7-2bdf45288757") : secret "canary-serving-cert" not found Apr 23 13:32:49.057883 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:49.057838 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-p54pj"] Apr 23 13:32:49.063072 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:32:49.063032 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33d32343_1781_48b5_bdcd_a04d2dec36da.slice/crio-12d68718569f5e22e9fc4e895d3b29ac5097a3e77efc03b1bf8fcfe7b8ddcfaa WatchSource:0}: Error finding container 12d68718569f5e22e9fc4e895d3b29ac5097a3e77efc03b1bf8fcfe7b8ddcfaa: Status 404 returned error can't find the container with id 12d68718569f5e22e9fc4e895d3b29ac5097a3e77efc03b1bf8fcfe7b8ddcfaa Apr 23 13:32:49.345526 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:49.345483 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-p54pj" event={"ID":"33d32343-1781-48b5-bdcd-a04d2dec36da","Type":"ContainerStarted","Data":"12d68718569f5e22e9fc4e895d3b29ac5097a3e77efc03b1bf8fcfe7b8ddcfaa"} Apr 23 13:32:49.735178 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:49.735138 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:32:49.735367 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:49.735188 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9" Apr 23 13:32:49.735367 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:49.735340 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:49.735476 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:49.735427 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls podName:7b86f405-4871-42b3-aa86-bde954086fa9 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:53.735402437 +0000 UTC m=+41.117931567 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls") pod "dns-default-2hdsq" (UID: "7b86f405-4871-42b3-aa86-bde954086fa9") : secret "dns-default-metrics-tls" not found Apr 23 13:32:49.735476 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:49.735339 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:49.735578 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:49.735507 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert podName:441292d6-55dd-4164-b5f7-2bdf45288757 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:53.73548863 +0000 UTC m=+41.118017752 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert") pod "ingress-canary-rhpd9" (UID: "441292d6-55dd-4164-b5f7-2bdf45288757") : secret "canary-serving-cert" not found Apr 23 13:32:50.351122 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:50.351078 2567 generic.go:358] "Generic (PLEG): container finished" podID="cc296f8c-211e-4e05-8959-a5aba129cc83" containerID="ad1e309190114e399de2844a2d3fac43eeebecb494253fac1427a83a7b37f39a" exitCode=0 Apr 23 13:32:50.351555 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:50.351162 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2589" event={"ID":"cc296f8c-211e-4e05-8959-a5aba129cc83","Type":"ContainerDied","Data":"ad1e309190114e399de2844a2d3fac43eeebecb494253fac1427a83a7b37f39a"} Apr 23 13:32:51.356242 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:51.356186 2567 generic.go:358] "Generic (PLEG): container finished" podID="cc296f8c-211e-4e05-8959-a5aba129cc83" containerID="efc66599ae9de64ee15a8bd2773efb2d9a6d2d615d2a22c1819d170b36b15dbb" exitCode=0 Apr 23 13:32:51.356673 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:51.356259 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2589" event={"ID":"cc296f8c-211e-4e05-8959-a5aba129cc83","Type":"ContainerDied","Data":"efc66599ae9de64ee15a8bd2773efb2d9a6d2d615d2a22c1819d170b36b15dbb"} Apr 23 13:32:53.362601 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:53.362563 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w2589" event={"ID":"cc296f8c-211e-4e05-8959-a5aba129cc83","Type":"ContainerStarted","Data":"aebc54006c0278c8de07aaaf3d9fd63d822fc8ade95f4aa0ae37f94018e8a09f"} Apr 23 13:32:53.366709 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:53.366670 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-p54pj" event={"ID":"33d32343-1781-48b5-bdcd-a04d2dec36da","Type":"ContainerStarted","Data":"060b03b9f8a34e6b60f212aa8840cfcc860469470bb18efe179af00457311c9c"} Apr 23 13:32:53.388198 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:53.388153 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-w2589" podStartSLOduration=7.001415119 podStartE2EDuration="40.388140537s" podCreationTimestamp="2026-04-23 13:32:13 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.900399815 +0000 UTC m=+3.282928926" lastFinishedPulling="2026-04-23 13:32:49.287125236 +0000 UTC m=+36.669654344" observedRunningTime="2026-04-23 13:32:53.386208363 +0000 UTC m=+40.768737504" watchObservedRunningTime="2026-04-23 13:32:53.388140537 +0000 UTC m=+40.770669666" Apr 23 13:32:53.406620 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:53.406575 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-p54pj" podStartSLOduration=35.473500065 podStartE2EDuration="39.406562757s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:32:49.064878981 +0000 UTC m=+36.447408090" lastFinishedPulling="2026-04-23 13:32:52.997941671 +0000 UTC m=+40.380470782" observedRunningTime="2026-04-23 13:32:53.406333308 +0000 UTC m=+40.788862438" watchObservedRunningTime="2026-04-23 13:32:53.406562757 +0000 UTC m=+40.789091886" Apr 23 13:32:53.770942 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:53.770821 2567 
Apr 23 13:32:53.770942 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:53.770821 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq"
Apr 23 13:32:53.770942 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:32:53.770888 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9"
Apr 23 13:32:53.771173 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:53.770969 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:53.771173 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:53.771001 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:53.771173 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:53.771045 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls podName:7b86f405-4871-42b3-aa86-bde954086fa9 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:01.771025424 +0000 UTC m=+49.153554546 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls") pod "dns-default-2hdsq" (UID: "7b86f405-4871-42b3-aa86-bde954086fa9") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:53.771173 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:32:53.771082 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert podName:441292d6-55dd-4164-b5f7-2bdf45288757 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:01.77106456 +0000 UTC m=+49.153593667 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert") pod "ingress-canary-rhpd9" (UID: "441292d6-55dd-4164-b5f7-2bdf45288757") : secret "canary-serving-cert" not found
Apr 23 13:33:01.827188 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:01.827133 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq"
Apr 23 13:33:01.827188 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:01.827187 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9"
Apr 23 13:33:01.827711 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:01.827296 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:01.827711 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:01.827311 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:01.827711 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:01.827371 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert podName:441292d6-55dd-4164-b5f7-2bdf45288757 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:17.827357635 +0000 UTC m=+65.209886743 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert") pod "ingress-canary-rhpd9" (UID: "441292d6-55dd-4164-b5f7-2bdf45288757") : secret "canary-serving-cert" not found
Apr 23 13:33:01.827711 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:01.827386 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls podName:7b86f405-4871-42b3-aa86-bde954086fa9 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:17.827378809 +0000 UTC m=+65.209907917 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls") pod "dns-default-2hdsq" (UID: "7b86f405-4871-42b3-aa86-bde954086fa9") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:10.334116 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:10.334087 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kqqbw"
Apr 23 13:33:17.834119 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:17.834084 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq"
Apr 23 13:33:17.834119 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:17.834131 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9"
Apr 23 13:33:17.834639 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:17.834282 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:17.834639 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:17.834306 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:17.834639 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:17.834356 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls podName:7b86f405-4871-42b3-aa86-bde954086fa9 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:49.834335937 +0000 UTC m=+97.216865045 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls") pod "dns-default-2hdsq" (UID: "7b86f405-4871-42b3-aa86-bde954086fa9") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:17.834639 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:17.834372 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert podName:441292d6-55dd-4164-b5f7-2bdf45288757 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:49.834365789 +0000 UTC m=+97.216894897 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert") pod "ingress-canary-rhpd9" (UID: "441292d6-55dd-4164-b5f7-2bdf45288757") : secret "canary-serving-cert" not found
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert") pod "ingress-canary-rhpd9" (UID: "441292d6-55dd-4164-b5f7-2bdf45288757") : secret "canary-serving-cert" not found Apr 23 13:33:18.942655 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:18.942624 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:33:18.943136 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:18.942682 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4cx\" (UniqueName: \"kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx\") pod \"network-check-target-75h8j\" (UID: \"a43bd7f9-1505-4e58-acda-ef8e398e302d\") " pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:33:18.943136 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:18.942773 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 13:33:18.943136 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:18.942842 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs podName:69a4531d-2959-43b5-929f-9d7ddf10163b nodeName:}" failed. No retries permitted until 2026-04-23 13:34:22.94282688 +0000 UTC m=+130.325355989 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs") pod "network-metrics-daemon-c246c" (UID: "69a4531d-2959-43b5-929f-9d7ddf10163b") : secret "metrics-daemon-secret" not found Apr 23 13:33:18.946218 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:18.946198 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 13:33:18.956435 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:18.956414 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 13:33:18.967905 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:18.967883 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw4cx\" (UniqueName: \"kubernetes.io/projected/a43bd7f9-1505-4e58-acda-ef8e398e302d-kube-api-access-zw4cx\") pod \"network-check-target-75h8j\" (UID: \"a43bd7f9-1505-4e58-acda-ef8e398e302d\") " pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:33:19.019729 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:19.019701 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jcmp5\"" Apr 23 13:33:19.027180 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:19.027156 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:33:19.168249 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:19.168204 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-75h8j"] Apr 23 13:33:19.172011 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:33:19.171973 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda43bd7f9_1505_4e58_acda_ef8e398e302d.slice/crio-560bc3cdbb080a75e599f1ae853551c4cc6cc5613f1655ed2984518c6ec746f8 WatchSource:0}: Error finding container 560bc3cdbb080a75e599f1ae853551c4cc6cc5613f1655ed2984518c6ec746f8: Status 404 returned error can't find the container with id 560bc3cdbb080a75e599f1ae853551c4cc6cc5613f1655ed2984518c6ec746f8 Apr 23 13:33:19.414600 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:19.414566 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-75h8j" event={"ID":"a43bd7f9-1505-4e58-acda-ef8e398e302d","Type":"ContainerStarted","Data":"560bc3cdbb080a75e599f1ae853551c4cc6cc5613f1655ed2984518c6ec746f8"} Apr 23 13:33:22.421542 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:22.421503 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-75h8j" event={"ID":"a43bd7f9-1505-4e58-acda-ef8e398e302d","Type":"ContainerStarted","Data":"1ef507c1c5792b5e6875ab01b454a48512c972362427fb42e318d76752dc412a"} Apr 23 13:33:22.422041 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:22.421681 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-75h8j" Apr 23 13:33:22.440164 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:22.440111 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-75h8j" podStartSLOduration=66.745219445 podStartE2EDuration="1m9.440097582s" podCreationTimestamp="2026-04-23 13:32:13 +0000 UTC" firstStartedPulling="2026-04-23 13:33:19.173808738 +0000 UTC m=+66.556337845" lastFinishedPulling="2026-04-23 13:33:21.86868687 +0000 UTC m=+69.251215982" observedRunningTime="2026-04-23 13:33:22.439580152 +0000 UTC m=+69.822109282" watchObservedRunningTime="2026-04-23 13:33:22.440097582 +0000 UTC m=+69.822626689" Apr 23 13:33:49.845007 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:49.844950 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq" Apr 23 13:33:49.845007 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:49.845017 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9" Apr 23 13:33:49.845490 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:49.845111 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:33:49.845490 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:49.845150 2567 secret.go:189] Couldn't get secret 
Apr 23 13:33:49.845490 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:49.845207 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls podName:7b86f405-4871-42b3-aa86-bde954086fa9 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:53.845186918 +0000 UTC m=+161.227716029 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls") pod "dns-default-2hdsq" (UID: "7b86f405-4871-42b3-aa86-bde954086fa9") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:49.845490 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:33:49.845243 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert podName:441292d6-55dd-4164-b5f7-2bdf45288757 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:53.845217763 +0000 UTC m=+161.227746871 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert") pod "ingress-canary-rhpd9" (UID: "441292d6-55dd-4164-b5f7-2bdf45288757") : secret "canary-serving-cert" not found
Apr 23 13:33:53.426472 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:33:53.426442 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-75h8j"
Apr 23 13:34:06.956961 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:06.956928 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kwjr8"]
Apr 23 13:34:06.959492 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:06.959477 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kwjr8"
Apr 23 13:34:06.962913 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:06.962887 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:34:06.963020 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:06.962967 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 23 13:34:06.964186 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:06.964167 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-dcmfn\""
Apr 23 13:34:06.966851 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:06.966831 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6"]
Apr 23 13:34:06.968513 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:06.968493 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6"
Apr 23 13:34:06.973767 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:06.973747 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kwjr8"]
Apr 23 13:34:06.976951 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:06.976928 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 23 13:34:06.976951 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:06.976941 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 23 13:34:06.981325 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:06.981304 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:34:06.981601 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:06.981584 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-j6rh4\""
Apr 23 13:34:07.000242 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.000210 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6"]
Apr 23 13:34:07.067699 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.067663 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dcpx6\" (UID: \"05d35694-4030-4c24-97f6-e23c18bb45a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6"
Apr 23 13:34:07.067885 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.067722 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4nfx\" (UniqueName: \"kubernetes.io/projected/f6442128-0a86-4621-9b97-053c66c0c77c-kube-api-access-b4nfx\") pod \"volume-data-source-validator-7c6cbb6c87-kwjr8\" (UID: \"f6442128-0a86-4621-9b97-053c66c0c77c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kwjr8"
Apr 23 13:34:07.067885 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.067745 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brfm5\" (UniqueName: \"kubernetes.io/projected/05d35694-4030-4c24-97f6-e23c18bb45a2-kube-api-access-brfm5\") pod \"cluster-samples-operator-6dc5bdb6b4-dcpx6\" (UID: \"05d35694-4030-4c24-97f6-e23c18bb45a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6"
Apr 23 13:34:07.078917 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.078883 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv"]
Apr 23 13:34:07.080759 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.080743 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv"
Apr 23 13:34:07.083773 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.083751 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 13:34:07.084453 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.084265 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 13:34:07.084453 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.084368 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:34:07.084453 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.084368 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 13:34:07.084967 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.084936 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-lqnxh\""
Apr 23 13:34:07.100688 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.100649 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv"]
Apr 23 13:34:07.168411 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.168368 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2357249-2dd2-4507-999a-e670d1aa527f-config\") pod \"service-ca-operator-d6fc45fc5-rpgkv\" (UID: \"e2357249-2dd2-4507-999a-e670d1aa527f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv"
Apr 23 13:34:07.168411 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.168416 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dcpx6\" (UID: \"05d35694-4030-4c24-97f6-e23c18bb45a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6"
Apr 23 13:34:07.168616 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.168466 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2357249-2dd2-4507-999a-e670d1aa527f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rpgkv\" (UID: \"e2357249-2dd2-4507-999a-e670d1aa527f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv"
Apr 23 13:34:07.168616 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:07.168498 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 13:34:07.168616 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:07.168548 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls podName:05d35694-4030-4c24-97f6-e23c18bb45a2 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:07.668534248 +0000 UTC m=+115.051063355 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dcpx6" (UID: "05d35694-4030-4c24-97f6-e23c18bb45a2") : secret "samples-operator-tls" not found
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dcpx6" (UID: "05d35694-4030-4c24-97f6-e23c18bb45a2") : secret "samples-operator-tls" not found Apr 23 13:34:07.168616 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.168562 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4nfx\" (UniqueName: \"kubernetes.io/projected/f6442128-0a86-4621-9b97-053c66c0c77c-kube-api-access-b4nfx\") pod \"volume-data-source-validator-7c6cbb6c87-kwjr8\" (UID: \"f6442128-0a86-4621-9b97-053c66c0c77c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kwjr8" Apr 23 13:34:07.168616 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.168583 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brfm5\" (UniqueName: \"kubernetes.io/projected/05d35694-4030-4c24-97f6-e23c18bb45a2-kube-api-access-brfm5\") pod \"cluster-samples-operator-6dc5bdb6b4-dcpx6\" (UID: \"05d35694-4030-4c24-97f6-e23c18bb45a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6" Apr 23 13:34:07.168616 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.168601 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxb4k\" (UniqueName: \"kubernetes.io/projected/e2357249-2dd2-4507-999a-e670d1aa527f-kube-api-access-qxb4k\") pod \"service-ca-operator-d6fc45fc5-rpgkv\" (UID: \"e2357249-2dd2-4507-999a-e670d1aa527f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" Apr 23 13:34:07.182954 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.182920 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4nfx\" (UniqueName: \"kubernetes.io/projected/f6442128-0a86-4621-9b97-053c66c0c77c-kube-api-access-b4nfx\") pod \"volume-data-source-validator-7c6cbb6c87-kwjr8\" (UID: \"f6442128-0a86-4621-9b97-053c66c0c77c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kwjr8" Apr 23 13:34:07.183717 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.183699 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brfm5\" (UniqueName: \"kubernetes.io/projected/05d35694-4030-4c24-97f6-e23c18bb45a2-kube-api-access-brfm5\") pod \"cluster-samples-operator-6dc5bdb6b4-dcpx6\" (UID: \"05d35694-4030-4c24-97f6-e23c18bb45a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6" Apr 23 13:34:07.269729 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.269639 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kwjr8" Apr 23 13:34:07.269867 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.269811 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxb4k\" (UniqueName: \"kubernetes.io/projected/e2357249-2dd2-4507-999a-e670d1aa527f-kube-api-access-qxb4k\") pod \"service-ca-operator-d6fc45fc5-rpgkv\" (UID: \"e2357249-2dd2-4507-999a-e670d1aa527f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" Apr 23 13:34:07.269867 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.269854 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2357249-2dd2-4507-999a-e670d1aa527f-config\") pod \"service-ca-operator-d6fc45fc5-rpgkv\" (UID: \"e2357249-2dd2-4507-999a-e670d1aa527f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" Apr 23 13:34:07.269935 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.269890 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2357249-2dd2-4507-999a-e670d1aa527f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rpgkv\" (UID: \"e2357249-2dd2-4507-999a-e670d1aa527f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" Apr 23 13:34:07.270610 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.270434 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2357249-2dd2-4507-999a-e670d1aa527f-config\") pod \"service-ca-operator-d6fc45fc5-rpgkv\" (UID: \"e2357249-2dd2-4507-999a-e670d1aa527f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" Apr 23 13:34:07.272029 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.272004 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2357249-2dd2-4507-999a-e670d1aa527f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rpgkv\" (UID: \"e2357249-2dd2-4507-999a-e670d1aa527f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" Apr 23 13:34:07.278654 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.278619 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxb4k\" (UniqueName: \"kubernetes.io/projected/e2357249-2dd2-4507-999a-e670d1aa527f-kube-api-access-qxb4k\") pod \"service-ca-operator-d6fc45fc5-rpgkv\" (UID: \"e2357249-2dd2-4507-999a-e670d1aa527f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" Apr 23 13:34:07.386051 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.386012 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kwjr8"] Apr 23 13:34:07.390190 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:07.390161 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6442128_0a86_4621_9b97_053c66c0c77c.slice/crio-ef03eb31ce772b271b8a11cb1a94e2b25ca7f02fec7aaef2f537e9da2e6589e6 WatchSource:0}: Error finding container ef03eb31ce772b271b8a11cb1a94e2b25ca7f02fec7aaef2f537e9da2e6589e6: Status 404 returned error can't find the container with id ef03eb31ce772b271b8a11cb1a94e2b25ca7f02fec7aaef2f537e9da2e6589e6 Apr 23 13:34:07.391026 ip-10-0-141-176 
kubenswrapper[2567]: I0423 13:34:07.391003 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" Apr 23 13:34:07.504139 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.504108 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kwjr8" event={"ID":"f6442128-0a86-4621-9b97-053c66c0c77c","Type":"ContainerStarted","Data":"ef03eb31ce772b271b8a11cb1a94e2b25ca7f02fec7aaef2f537e9da2e6589e6"} Apr 23 13:34:07.505895 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.505867 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv"] Apr 23 13:34:07.508597 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:07.508571 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2357249_2dd2_4507_999a_e670d1aa527f.slice/crio-cffe0d2a73c57b7ba8dc285c86ecff6ab7900c5ef415bd092b1b81de88b008b5 WatchSource:0}: Error finding container cffe0d2a73c57b7ba8dc285c86ecff6ab7900c5ef415bd092b1b81de88b008b5: Status 404 returned error can't find the container with id cffe0d2a73c57b7ba8dc285c86ecff6ab7900c5ef415bd092b1b81de88b008b5 Apr 23 13:34:07.672901 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:07.672864 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dcpx6\" (UID: \"05d35694-4030-4c24-97f6-e23c18bb45a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6" Apr 23 13:34:07.673083 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:07.673013 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 13:34:07.673083 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:07.673074 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls podName:05d35694-4030-4c24-97f6-e23c18bb45a2 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:08.673057524 +0000 UTC m=+116.055586637 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dcpx6" (UID: "05d35694-4030-4c24-97f6-e23c18bb45a2") : secret "samples-operator-tls" not found Apr 23 13:34:08.507894 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:08.507849 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" event={"ID":"e2357249-2dd2-4507-999a-e670d1aa527f","Type":"ContainerStarted","Data":"cffe0d2a73c57b7ba8dc285c86ecff6ab7900c5ef415bd092b1b81de88b008b5"} Apr 23 13:34:08.680943 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:08.680903 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dcpx6\" (UID: \"05d35694-4030-4c24-97f6-e23c18bb45a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6" Apr 23 13:34:08.681128 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:08.681079 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 13:34:08.681191 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:08.681159 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls podName:05d35694-4030-4c24-97f6-e23c18bb45a2 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:10.681136203 +0000 UTC m=+118.063665318 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dcpx6" (UID: "05d35694-4030-4c24-97f6-e23c18bb45a2") : secret "samples-operator-tls" not found Apr 23 13:34:09.510458 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:09.510433 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kwjr8" event={"ID":"f6442128-0a86-4621-9b97-053c66c0c77c","Type":"ContainerStarted","Data":"ba0e735c670d3ab08158361b2125f09c7f483501759423bdf6dfc9dd93adb02d"} Apr 23 13:34:09.528861 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:09.528814 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kwjr8" podStartSLOduration=2.379919019 podStartE2EDuration="3.528799564s" podCreationTimestamp="2026-04-23 13:34:06 +0000 UTC" firstStartedPulling="2026-04-23 13:34:07.392037711 +0000 UTC m=+114.774566819" lastFinishedPulling="2026-04-23 13:34:08.540918253 +0000 UTC m=+115.923447364" observedRunningTime="2026-04-23 13:34:09.528399307 +0000 UTC m=+116.910928442" watchObservedRunningTime="2026-04-23 13:34:09.528799564 +0000 UTC m=+116.911328697" Apr 23 13:34:10.513204 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:10.513164 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" event={"ID":"e2357249-2dd2-4507-999a-e670d1aa527f","Type":"ContainerStarted","Data":"23f81f783aee1cce8c184ae548f6c3a7bfe119b849e468ac86b52c7e15fcee19"} Apr 23 13:34:10.529507 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:10.529461 2567 
Apr 23 13:34:10.529507 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:10.529461 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" podStartSLOduration=1.534065089 podStartE2EDuration="3.529445207s" podCreationTimestamp="2026-04-23 13:34:07 +0000 UTC" firstStartedPulling="2026-04-23 13:34:07.510169409 +0000 UTC m=+114.892698517" lastFinishedPulling="2026-04-23 13:34:09.505549513 +0000 UTC m=+116.888078635" observedRunningTime="2026-04-23 13:34:10.528671469 +0000 UTC m=+117.911200599" watchObservedRunningTime="2026-04-23 13:34:10.529445207 +0000 UTC m=+117.911974384"
Apr 23 13:34:10.695676 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:10.695635 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dcpx6\" (UID: \"05d35694-4030-4c24-97f6-e23c18bb45a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6"
Apr 23 13:34:10.695835 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:10.695742 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 13:34:10.695835 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:10.695797 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls podName:05d35694-4030-4c24-97f6-e23c18bb45a2 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:14.695783369 +0000 UTC m=+122.078312476 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dcpx6" (UID: "05d35694-4030-4c24-97f6-e23c18bb45a2") : secret "samples-operator-tls" not found
Apr 23 13:34:11.051185 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.051149 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5658766989-n4pmr"]
Apr 23 13:34:11.053462 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.053439 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.056504 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.056476 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7pd9g\""
Apr 23 13:34:11.056643 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.056503 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 13:34:11.056643 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.056503 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 13:34:11.056643 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.056558 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 13:34:11.064852 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.064832 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 13:34:11.066699 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.066676 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5658766989-n4pmr"]
Apr 23 13:34:11.099289 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.099258 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c27144db-cd34-4e4d-8b1f-7dd4033de254-trusted-ca\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.099289 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.099292 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c27144db-cd34-4e4d-8b1f-7dd4033de254-ca-trust-extracted\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.099481 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.099315 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c27144db-cd34-4e4d-8b1f-7dd4033de254-image-registry-private-configuration\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.099481 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.099379 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-bound-sa-token\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.099481 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.099414 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c27144db-cd34-4e4d-8b1f-7dd4033de254-installation-pull-secrets\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.099481 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.099444 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-certificates\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.099481 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.099472 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.099646 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.099488 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szjbn\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-kube-api-access-szjbn\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.200136 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.200094 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c27144db-cd34-4e4d-8b1f-7dd4033de254-trusted-ca\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.200136 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.200133 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c27144db-cd34-4e4d-8b1f-7dd4033de254-ca-trust-extracted\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.200396 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.200154 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c27144db-cd34-4e4d-8b1f-7dd4033de254-image-registry-private-configuration\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.200396 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.200200 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-bound-sa-token\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:34:11.200396 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.200256 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName:
\"kubernetes.io/secret/c27144db-cd34-4e4d-8b1f-7dd4033de254-installation-pull-secrets\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:11.200396 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.200286 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-certificates\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:11.200396 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.200318 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:11.200396 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.200364 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szjbn\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-kube-api-access-szjbn\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:11.200571 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:11.200408 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:34:11.200571 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:11.200422 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5658766989-n4pmr: secret "image-registry-tls" not found Apr 23 13:34:11.200571 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:11.200479 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls podName:c27144db-cd34-4e4d-8b1f-7dd4033de254 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:11.700458472 +0000 UTC m=+119.082987594 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls") pod "image-registry-5658766989-n4pmr" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254") : secret "image-registry-tls" not found Apr 23 13:34:11.200889 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.200872 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c27144db-cd34-4e4d-8b1f-7dd4033de254-ca-trust-extracted\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:11.201165 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.201142 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-certificates\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:11.201546 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.201524 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c27144db-cd34-4e4d-8b1f-7dd4033de254-trusted-ca\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:11.202856 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.202827 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c27144db-cd34-4e4d-8b1f-7dd4033de254-installation-pull-secrets\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:11.203311 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.203295 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c27144db-cd34-4e4d-8b1f-7dd4033de254-image-registry-private-configuration\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:11.211335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.211310 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-bound-sa-token\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:11.211425 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.211405 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szjbn\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-kube-api-access-szjbn\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:11.704113 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.704081 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:11.704598 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:11.704280 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:34:11.704598 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:11.704305 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5658766989-n4pmr: secret "image-registry-tls" not found Apr 23 13:34:11.704598 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:11.704396 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls podName:c27144db-cd34-4e4d-8b1f-7dd4033de254 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:12.704371795 +0000 UTC m=+120.086900918 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls") pod "image-registry-5658766989-n4pmr" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254") : secret "image-registry-tls" not found Apr 23 13:34:11.888738 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.888704 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-8dscm"] Apr 23 13:34:11.891598 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.891579 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8dscm" Apr 23 13:34:11.894437 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.894414 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-hq9rz\"" Apr 23 13:34:11.894552 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.894422 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 23 13:34:11.895629 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.895608 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 23 13:34:11.901387 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:11.901366 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-8dscm"] Apr 23 13:34:12.006109 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.006010 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5whl\" (UniqueName: \"kubernetes.io/projected/0fc36607-6fd1-4cc6-b3e3-61494fda3497-kube-api-access-s5whl\") pod \"migrator-74bb7799d9-8dscm\" (UID: \"0fc36607-6fd1-4cc6-b3e3-61494fda3497\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8dscm" Apr 23 13:34:12.106515 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.106475 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5whl\" (UniqueName: \"kubernetes.io/projected/0fc36607-6fd1-4cc6-b3e3-61494fda3497-kube-api-access-s5whl\") pod \"migrator-74bb7799d9-8dscm\" (UID: 
\"0fc36607-6fd1-4cc6-b3e3-61494fda3497\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8dscm" Apr 23 13:34:12.115136 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.115085 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5whl\" (UniqueName: \"kubernetes.io/projected/0fc36607-6fd1-4cc6-b3e3-61494fda3497-kube-api-access-s5whl\") pod \"migrator-74bb7799d9-8dscm\" (UID: \"0fc36607-6fd1-4cc6-b3e3-61494fda3497\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8dscm" Apr 23 13:34:12.200852 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.200812 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8dscm" Apr 23 13:34:12.316266 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.316217 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-8dscm"] Apr 23 13:34:12.319672 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:12.319646 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fc36607_6fd1_4cc6_b3e3_61494fda3497.slice/crio-9f21cbcd5b5dd1effc5b982a91707932d12ad232e97d520973b6d67a9291f913 WatchSource:0}: Error finding container 9f21cbcd5b5dd1effc5b982a91707932d12ad232e97d520973b6d67a9291f913: Status 404 returned error can't find the container with id 9f21cbcd5b5dd1effc5b982a91707932d12ad232e97d520973b6d67a9291f913 Apr 23 13:34:12.518398 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.518362 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8dscm" event={"ID":"0fc36607-6fd1-4cc6-b3e3-61494fda3497","Type":"ContainerStarted","Data":"9f21cbcd5b5dd1effc5b982a91707932d12ad232e97d520973b6d67a9291f913"} Apr 23 13:34:12.711994 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.711950 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:12.712504 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:12.712113 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:34:12.712504 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:12.712135 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5658766989-n4pmr: secret "image-registry-tls" not found Apr 23 13:34:12.712504 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:12.712192 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls podName:c27144db-cd34-4e4d-8b1f-7dd4033de254 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:14.712173753 +0000 UTC m=+122.094702865 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls") pod "image-registry-5658766989-n4pmr" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254") : secret "image-registry-tls" not found Apr 23 13:34:12.941441 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.941407 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-45rgk"] Apr 23 13:34:12.943590 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.943557 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-45rgk" Apr 23 13:34:12.948119 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.948088 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 13:34:12.949071 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.949045 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 13:34:12.950102 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.950078 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bxmwj\"" Apr 23 13:34:12.950267 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.950119 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 13:34:12.950267 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.950153 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 13:34:12.968104 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:12.968018 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-45rgk"] Apr 23 13:34:13.004643 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.004618 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4nhld_f6811f33-1a89-4437-950c-bdb29fcbc2f5/dns-node-resolver/0.log" Apr 23 13:34:13.014050 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.014021 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/61b6328e-e159-4084-9f82-79cda889e4b1-signing-cabundle\") pod \"service-ca-865cb79987-45rgk\" (UID: \"61b6328e-e159-4084-9f82-79cda889e4b1\") " pod="openshift-service-ca/service-ca-865cb79987-45rgk" Apr 23 13:34:13.014212 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.014056 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8vrh\" (UniqueName: \"kubernetes.io/projected/61b6328e-e159-4084-9f82-79cda889e4b1-kube-api-access-g8vrh\") pod \"service-ca-865cb79987-45rgk\" (UID: \"61b6328e-e159-4084-9f82-79cda889e4b1\") " pod="openshift-service-ca/service-ca-865cb79987-45rgk" Apr 23 13:34:13.014212 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.014093 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/61b6328e-e159-4084-9f82-79cda889e4b1-signing-key\") pod \"service-ca-865cb79987-45rgk\" (UID: \"61b6328e-e159-4084-9f82-79cda889e4b1\") " pod="openshift-service-ca/service-ca-865cb79987-45rgk" Apr 23 13:34:13.115033 ip-10-0-141-176 kubenswrapper[2567]: I0423 
13:34:13.115001 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/61b6328e-e159-4084-9f82-79cda889e4b1-signing-key\") pod \"service-ca-865cb79987-45rgk\" (UID: \"61b6328e-e159-4084-9f82-79cda889e4b1\") " pod="openshift-service-ca/service-ca-865cb79987-45rgk" Apr 23 13:34:13.115274 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.115096 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/61b6328e-e159-4084-9f82-79cda889e4b1-signing-cabundle\") pod \"service-ca-865cb79987-45rgk\" (UID: \"61b6328e-e159-4084-9f82-79cda889e4b1\") " pod="openshift-service-ca/service-ca-865cb79987-45rgk" Apr 23 13:34:13.115274 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.115146 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8vrh\" (UniqueName: \"kubernetes.io/projected/61b6328e-e159-4084-9f82-79cda889e4b1-kube-api-access-g8vrh\") pod \"service-ca-865cb79987-45rgk\" (UID: \"61b6328e-e159-4084-9f82-79cda889e4b1\") " pod="openshift-service-ca/service-ca-865cb79987-45rgk" Apr 23 13:34:13.118936 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.118885 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 13:34:13.119059 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.118895 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 13:34:13.126010 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.125986 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/61b6328e-e159-4084-9f82-79cda889e4b1-signing-cabundle\") pod \"service-ca-865cb79987-45rgk\" (UID: \"61b6328e-e159-4084-9f82-79cda889e4b1\") " pod="openshift-service-ca/service-ca-865cb79987-45rgk" Apr 23 13:34:13.126843 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.126823 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 13:34:13.128255 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.128149 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/61b6328e-e159-4084-9f82-79cda889e4b1-signing-key\") pod \"service-ca-865cb79987-45rgk\" (UID: \"61b6328e-e159-4084-9f82-79cda889e4b1\") " pod="openshift-service-ca/service-ca-865cb79987-45rgk" Apr 23 13:34:13.136408 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.136384 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 13:34:13.146334 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.146310 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8vrh\" (UniqueName: \"kubernetes.io/projected/61b6328e-e159-4084-9f82-79cda889e4b1-kube-api-access-g8vrh\") pod \"service-ca-865cb79987-45rgk\" (UID: \"61b6328e-e159-4084-9f82-79cda889e4b1\") " pod="openshift-service-ca/service-ca-865cb79987-45rgk" Apr 23 13:34:13.258399 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.258315 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bxmwj\"" Apr 23 13:34:13.265808 ip-10-0-141-176 kubenswrapper[2567]: I0423 
13:34:13.265784 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-45rgk" Apr 23 13:34:13.517743 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.517701 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-45rgk"] Apr 23 13:34:13.523409 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:13.523381 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8dscm" event={"ID":"0fc36607-6fd1-4cc6-b3e3-61494fda3497","Type":"ContainerStarted","Data":"af770fc6485b9b3a09846baad3b461b661572aedf92a4af363c858ad5c0e7535"} Apr 23 13:34:13.530076 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:13.530049 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61b6328e_e159_4084_9f82_79cda889e4b1.slice/crio-c7c7474c48f4292bdd60fd7173f028e2163a4c88871ef251c659204b14601491 WatchSource:0}: Error finding container c7c7474c48f4292bdd60fd7173f028e2163a4c88871ef251c659204b14601491: Status 404 returned error can't find the container with id c7c7474c48f4292bdd60fd7173f028e2163a4c88871ef251c659204b14601491 Apr 23 13:34:14.002453 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:14.002424 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5lgwg_a642a633-45bc-405e-899d-b28d88699e93/node-ca/0.log" Apr 23 13:34:14.526972 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:14.526933 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-45rgk" event={"ID":"61b6328e-e159-4084-9f82-79cda889e4b1","Type":"ContainerStarted","Data":"012256c8b08d4ab946da370f318d3dc7b63426da572b899c403a29f2d02df502"} Apr 23 13:34:14.527216 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:14.526974 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-45rgk" event={"ID":"61b6328e-e159-4084-9f82-79cda889e4b1","Type":"ContainerStarted","Data":"c7c7474c48f4292bdd60fd7173f028e2163a4c88871ef251c659204b14601491"} Apr 23 13:34:14.528380 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:14.528356 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8dscm" event={"ID":"0fc36607-6fd1-4cc6-b3e3-61494fda3497","Type":"ContainerStarted","Data":"0085251af091d33af5241310a3afbf9d739e06e3c85761d7742ffd8c8e80209e"} Apr 23 13:34:14.552797 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:14.550575 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-45rgk" podStartSLOduration=2.55055759 podStartE2EDuration="2.55055759s" podCreationTimestamp="2026-04-23 13:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:14.548828991 +0000 UTC m=+121.931358124" watchObservedRunningTime="2026-04-23 13:34:14.55055759 +0000 UTC m=+121.933086726" Apr 23 13:34:14.568695 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:14.568557 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-8dscm" podStartSLOduration=2.463172601 podStartE2EDuration="3.568541813s" podCreationTimestamp="2026-04-23 13:34:11 +0000 UTC" firstStartedPulling="2026-04-23 13:34:12.32156514 +0000 UTC 
m=+119.704094248" lastFinishedPulling="2026-04-23 13:34:13.426934351 +0000 UTC m=+120.809463460" observedRunningTime="2026-04-23 13:34:14.567666564 +0000 UTC m=+121.950195695" watchObservedRunningTime="2026-04-23 13:34:14.568541813 +0000 UTC m=+121.951070943" Apr 23 13:34:14.731898 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:14.731863 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dcpx6\" (UID: \"05d35694-4030-4c24-97f6-e23c18bb45a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6" Apr 23 13:34:14.732076 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:14.731945 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:14.732076 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:14.732013 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 13:34:14.732076 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:14.732041 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:34:14.732076 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:14.732051 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5658766989-n4pmr: secret "image-registry-tls" not found Apr 23 13:34:14.732216 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:14.732083 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls podName:05d35694-4030-4c24-97f6-e23c18bb45a2 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:22.732065806 +0000 UTC m=+130.114594913 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dcpx6" (UID: "05d35694-4030-4c24-97f6-e23c18bb45a2") : secret "samples-operator-tls" not found Apr 23 13:34:14.732216 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:14.732098 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls podName:c27144db-cd34-4e4d-8b1f-7dd4033de254 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:18.732091026 +0000 UTC m=+126.114620133 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls") pod "image-registry-5658766989-n4pmr" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254") : secret "image-registry-tls" not found Apr 23 13:34:18.763787 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:18.763741 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:18.764311 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:18.763918 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:34:18.764311 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:18.763946 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5658766989-n4pmr: secret "image-registry-tls" not found Apr 23 13:34:18.764311 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:18.764020 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls podName:c27144db-cd34-4e4d-8b1f-7dd4033de254 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:26.763997969 +0000 UTC m=+134.146527081 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls") pod "image-registry-5658766989-n4pmr" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254") : secret "image-registry-tls" not found Apr 23 13:34:22.793694 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:22.793636 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dcpx6\" (UID: \"05d35694-4030-4c24-97f6-e23c18bb45a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6" Apr 23 13:34:22.796016 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:22.795989 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/05d35694-4030-4c24-97f6-e23c18bb45a2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dcpx6\" (UID: \"05d35694-4030-4c24-97f6-e23c18bb45a2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6" Apr 23 13:34:22.881787 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:22.881756 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-j6rh4\"" Apr 23 13:34:22.889480 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:22.889446 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6" Apr 23 13:34:22.995335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:22.995298 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:34:22.995482 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:22.995461 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 13:34:22.995551 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:22.995540 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs podName:69a4531d-2959-43b5-929f-9d7ddf10163b nodeName:}" failed. No retries permitted until 2026-04-23 13:36:24.995521873 +0000 UTC m=+252.378050982 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs") pod "network-metrics-daemon-c246c" (UID: "69a4531d-2959-43b5-929f-9d7ddf10163b") : secret "metrics-daemon-secret" not found Apr 23 13:34:23.012017 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:23.011990 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6"] Apr 23 13:34:23.555604 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:23.555564 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6" event={"ID":"05d35694-4030-4c24-97f6-e23c18bb45a2","Type":"ContainerStarted","Data":"e719eb0f3e3b7e3e68c51d0005ce1093e1ee3983bba3b718fbe2af82b954cb53"} Apr 23 13:34:25.563099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:25.563057 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6" event={"ID":"05d35694-4030-4c24-97f6-e23c18bb45a2","Type":"ContainerStarted","Data":"3c31b199a9565652c389dd6cc1b9e8264ce8baa09590a6653261b64b297b2697"} Apr 23 13:34:25.563099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:25.563101 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6" event={"ID":"05d35694-4030-4c24-97f6-e23c18bb45a2","Type":"ContainerStarted","Data":"5db8441284c8c23c18e50465c5bc0ca25c1c351d9fa18f5f099efb0da6edf006"} Apr 23 13:34:25.579474 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:25.579427 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dcpx6" podStartSLOduration=18.001461786 podStartE2EDuration="19.579413619s" podCreationTimestamp="2026-04-23 13:34:06 +0000 UTC" firstStartedPulling="2026-04-23 13:34:23.056339861 +0000 UTC m=+130.438868978" lastFinishedPulling="2026-04-23 13:34:24.634291702 +0000 UTC m=+132.016820811" observedRunningTime="2026-04-23 13:34:25.578859257 +0000 UTC m=+132.961388391" watchObservedRunningTime="2026-04-23 13:34:25.579413619 +0000 UTC m=+132.961943078" Apr 23 13:34:26.825668 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:26.825629 2567 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:26.828014 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:26.827992 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls\") pod \"image-registry-5658766989-n4pmr\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") " pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:26.966377 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:26.966343 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7pd9g\"" Apr 23 13:34:26.974681 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:26.974659 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:27.098645 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:27.098561 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5658766989-n4pmr"] Apr 23 13:34:27.102830 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:27.102797 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc27144db_cd34_4e4d_8b1f_7dd4033de254.slice/crio-05e67036ae7d822d12bccb86434ecab388e50359a4bf3eb7e627d28b1540ec76 WatchSource:0}: Error finding container 05e67036ae7d822d12bccb86434ecab388e50359a4bf3eb7e627d28b1540ec76: Status 404 returned error can't find the container with id 05e67036ae7d822d12bccb86434ecab388e50359a4bf3eb7e627d28b1540ec76 Apr 23 13:34:27.569721 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:27.569681 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5658766989-n4pmr" event={"ID":"c27144db-cd34-4e4d-8b1f-7dd4033de254","Type":"ContainerStarted","Data":"d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957"} Apr 23 13:34:27.569721 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:27.569716 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5658766989-n4pmr" event={"ID":"c27144db-cd34-4e4d-8b1f-7dd4033de254","Type":"ContainerStarted","Data":"05e67036ae7d822d12bccb86434ecab388e50359a4bf3eb7e627d28b1540ec76"} Apr 23 13:34:27.569919 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:27.569818 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5658766989-n4pmr" Apr 23 13:34:27.591299 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:27.591249 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5658766989-n4pmr" podStartSLOduration=16.591219089 podStartE2EDuration="16.591219089s" podCreationTimestamp="2026-04-23 13:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:27.590448783 +0000 UTC m=+134.972977913" watchObservedRunningTime="2026-04-23 13:34:27.591219089 +0000 UTC m=+134.973748218" Apr 23 13:34:35.956281 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:35.956248 2567 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-insights/insights-runtime-extractor-zgjs5"] Apr 23 13:34:35.959684 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:35.959666 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:35.965659 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:35.965476 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 13:34:35.965659 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:35.965505 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hkznh\"" Apr 23 13:34:35.965775 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:35.965693 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 13:34:35.965775 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:35.965740 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 13:34:35.966165 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:35.966147 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 13:34:35.983557 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:35.983532 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zgjs5"] Apr 23 13:34:36.011681 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.011645 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5658766989-n4pmr"] Apr 23 13:34:36.093379 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.093346 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:36.093517 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.093391 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-data-volume\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:36.093517 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.093432 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-crio-socket\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:36.093517 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.093451 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " 
pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:36.093631 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.093527 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwz58\" (UniqueName: \"kubernetes.io/projected/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-kube-api-access-qwz58\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:36.194426 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.194392 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:36.194426 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.194433 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-data-volume\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:36.194633 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.194561 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-crio-socket\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:36.194633 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.194597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:36.194728 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.194639 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwz58\" (UniqueName: \"kubernetes.io/projected/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-kube-api-access-qwz58\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:36.194728 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.194678 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-crio-socket\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:36.194822 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.194758 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-data-volume\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5" Apr 23 13:34:36.194993 ip-10-0-141-176 
kubenswrapper[2567]: I0423 13:34:36.194970 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5"
Apr 23 13:34:36.196986 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.196963 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5"
Apr 23 13:34:36.202756 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.202729 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwz58\" (UniqueName: \"kubernetes.io/projected/dfe0a743-56d3-48de-a1bd-7f4b26e628aa-kube-api-access-qwz58\") pod \"insights-runtime-extractor-zgjs5\" (UID: \"dfe0a743-56d3-48de-a1bd-7f4b26e628aa\") " pod="openshift-insights/insights-runtime-extractor-zgjs5"
Apr 23 13:34:36.268999 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.268928 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zgjs5"
Apr 23 13:34:36.390826 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.390791 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zgjs5"]
Apr 23 13:34:36.394179 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:36.394147 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfe0a743_56d3_48de_a1bd_7f4b26e628aa.slice/crio-cb7b39c03ef175db30c042879a4e05005bd7d5780c979a7619952788fa907b2a WatchSource:0}: Error finding container cb7b39c03ef175db30c042879a4e05005bd7d5780c979a7619952788fa907b2a: Status 404 returned error can't find the container with id cb7b39c03ef175db30c042879a4e05005bd7d5780c979a7619952788fa907b2a
Apr 23 13:34:36.592260 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.592150 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zgjs5" event={"ID":"dfe0a743-56d3-48de-a1bd-7f4b26e628aa","Type":"ContainerStarted","Data":"9c4691ea28d741ac4c9963bfb92c7e122c11a46ed51505041ff33af47a639469"}
Apr 23 13:34:36.592260 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:36.592188 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zgjs5" event={"ID":"dfe0a743-56d3-48de-a1bd-7f4b26e628aa","Type":"ContainerStarted","Data":"cb7b39c03ef175db30c042879a4e05005bd7d5780c979a7619952788fa907b2a"}
Apr 23 13:34:37.596561 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:37.596530 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zgjs5" event={"ID":"dfe0a743-56d3-48de-a1bd-7f4b26e628aa","Type":"ContainerStarted","Data":"21a3e7a1cb85533cb00de4bd17b78dd4024231a6a9ecf1a84be74a265a94100e"}
Apr 23 13:34:38.600457 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:38.600377 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zgjs5" event={"ID":"dfe0a743-56d3-48de-a1bd-7f4b26e628aa","Type":"ContainerStarted","Data":"0f9da9e9566f3c593b46234fb5facd97304cd3a1b8f9962ecbf52a7e32a43ba4"}
Apr 23 13:34:38.621290 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:38.621243 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-zgjs5" podStartSLOduration=1.856567997 podStartE2EDuration="3.621215487s" podCreationTimestamp="2026-04-23 13:34:35 +0000 UTC" firstStartedPulling="2026-04-23 13:34:36.445969191 +0000 UTC m=+143.828498298" lastFinishedPulling="2026-04-23 13:34:38.210616667 +0000 UTC m=+145.593145788" observedRunningTime="2026-04-23 13:34:38.619389805 +0000 UTC m=+146.001918935" watchObservedRunningTime="2026-04-23 13:34:38.621215487 +0000 UTC m=+146.003744616"
Apr 23 13:34:40.992025 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:40.991992 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q"]
Apr 23 13:34:40.996876 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:40.996859 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q"
Apr 23 13:34:40.999673 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:40.999648 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 23 13:34:40.999783 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:40.999649 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-mrxvx\""
Apr 23 13:34:41.004821 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:41.004796 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q"]
Apr 23 13:34:41.134489 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:41.134457 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/30a8fc37-9e56-46cd-92ab-f911c49adc81-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bt29q\" (UID: \"30a8fc37-9e56-46cd-92ab-f911c49adc81\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q"
Apr 23 13:34:41.235308 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:41.235271 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/30a8fc37-9e56-46cd-92ab-f911c49adc81-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bt29q\" (UID: \"30a8fc37-9e56-46cd-92ab-f911c49adc81\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q"
Apr 23 13:34:41.237739 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:41.237714 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/30a8fc37-9e56-46cd-92ab-f911c49adc81-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bt29q\" (UID: \"30a8fc37-9e56-46cd-92ab-f911c49adc81\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q"
Apr 23 13:34:41.305739 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:41.305633 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q"
Apr 23 13:34:41.419261 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:41.419211 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q"]
Apr 23 13:34:41.423814 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:41.423788 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30a8fc37_9e56_46cd_92ab_f911c49adc81.slice/crio-1707185297f98811117bb55c0fb733e636afea36dc161afe3580299d31c11296 WatchSource:0}: Error finding container 1707185297f98811117bb55c0fb733e636afea36dc161afe3580299d31c11296: Status 404 returned error can't find the container with id 1707185297f98811117bb55c0fb733e636afea36dc161afe3580299d31c11296
Apr 23 13:34:41.609655 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:41.609616 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q" event={"ID":"30a8fc37-9e56-46cd-92ab-f911c49adc81","Type":"ContainerStarted","Data":"1707185297f98811117bb55c0fb733e636afea36dc161afe3580299d31c11296"}
Apr 23 13:34:42.613291 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:42.613258 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q" event={"ID":"30a8fc37-9e56-46cd-92ab-f911c49adc81","Type":"ContainerStarted","Data":"0e6525979e651936577556947bd65fd9711d78ff71d04d575683fa7b27c56504"}
Apr 23 13:34:42.613687 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:42.613459 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q"
Apr 23 13:34:42.618059 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:42.618039 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q"
Apr 23 13:34:42.632189 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:42.632142 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bt29q" podStartSLOduration=1.574657234 podStartE2EDuration="2.632131032s" podCreationTimestamp="2026-04-23 13:34:40 +0000 UTC" firstStartedPulling="2026-04-23 13:34:41.426042396 +0000 UTC m=+148.808571507" lastFinishedPulling="2026-04-23 13:34:42.483516194 +0000 UTC m=+149.866045305" observedRunningTime="2026-04-23 13:34:42.631510052 +0000 UTC m=+150.014039183" watchObservedRunningTime="2026-04-23 13:34:42.632131032 +0000 UTC m=+150.014660162"
Apr 23 13:34:43.047114 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.047083 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ml98b"]
Apr 23 13:34:43.050084 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.050067 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.056193 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.055080 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 13:34:43.056193 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.055251 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 23 13:34:43.056193 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.055253 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 23 13:34:43.056193 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.055347 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 13:34:43.056193 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.055087 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 13:34:43.057097 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.057050 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-h9zz8\""
Apr 23 13:34:43.059758 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.059735 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ml98b"]
Apr 23 13:34:43.152340 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.152309 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fdbde0a6-7fe0-4fbb-8a91-766140234fc7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ml98b\" (UID: \"fdbde0a6-7fe0-4fbb-8a91-766140234fc7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.152340 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.152344 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4799\" (UniqueName: \"kubernetes.io/projected/fdbde0a6-7fe0-4fbb-8a91-766140234fc7-kube-api-access-z4799\") pod \"prometheus-operator-5676c8c784-ml98b\" (UID: \"fdbde0a6-7fe0-4fbb-8a91-766140234fc7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.152551 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.152450 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fdbde0a6-7fe0-4fbb-8a91-766140234fc7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ml98b\" (UID: \"fdbde0a6-7fe0-4fbb-8a91-766140234fc7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.152551 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.152484 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fdbde0a6-7fe0-4fbb-8a91-766140234fc7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ml98b\" (UID: \"fdbde0a6-7fe0-4fbb-8a91-766140234fc7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.253139 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.253100 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fdbde0a6-7fe0-4fbb-8a91-766140234fc7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ml98b\" (UID: \"fdbde0a6-7fe0-4fbb-8a91-766140234fc7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.253139 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.253144 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fdbde0a6-7fe0-4fbb-8a91-766140234fc7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ml98b\" (UID: \"fdbde0a6-7fe0-4fbb-8a91-766140234fc7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.253387 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.253328 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fdbde0a6-7fe0-4fbb-8a91-766140234fc7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ml98b\" (UID: \"fdbde0a6-7fe0-4fbb-8a91-766140234fc7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.253387 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.253378 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4799\" (UniqueName: \"kubernetes.io/projected/fdbde0a6-7fe0-4fbb-8a91-766140234fc7-kube-api-access-z4799\") pod \"prometheus-operator-5676c8c784-ml98b\" (UID: \"fdbde0a6-7fe0-4fbb-8a91-766140234fc7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.253886 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.253861 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fdbde0a6-7fe0-4fbb-8a91-766140234fc7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ml98b\" (UID: \"fdbde0a6-7fe0-4fbb-8a91-766140234fc7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.255688 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.255666 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fdbde0a6-7fe0-4fbb-8a91-766140234fc7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ml98b\" (UID: \"fdbde0a6-7fe0-4fbb-8a91-766140234fc7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.255787 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.255710 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fdbde0a6-7fe0-4fbb-8a91-766140234fc7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ml98b\" (UID: \"fdbde0a6-7fe0-4fbb-8a91-766140234fc7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.262724 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.262702 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4799\" (UniqueName: \"kubernetes.io/projected/fdbde0a6-7fe0-4fbb-8a91-766140234fc7-kube-api-access-z4799\") pod \"prometheus-operator-5676c8c784-ml98b\" (UID: \"fdbde0a6-7fe0-4fbb-8a91-766140234fc7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.365790 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.365757 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b"
Apr 23 13:34:43.481619 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.481583 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ml98b"]
Apr 23 13:34:43.484743 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:43.484718 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdbde0a6_7fe0_4fbb_8a91_766140234fc7.slice/crio-92e5d4bfb96e0f2147392726c509add717ddf7debc41bd6533edc59d63d341d3 WatchSource:0}: Error finding container 92e5d4bfb96e0f2147392726c509add717ddf7debc41bd6533edc59d63d341d3: Status 404 returned error can't find the container with id 92e5d4bfb96e0f2147392726c509add717ddf7debc41bd6533edc59d63d341d3
Apr 23 13:34:43.616838 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:43.616736 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b" event={"ID":"fdbde0a6-7fe0-4fbb-8a91-766140234fc7","Type":"ContainerStarted","Data":"92e5d4bfb96e0f2147392726c509add717ddf7debc41bd6533edc59d63d341d3"}
Apr 23 13:34:45.623269 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:45.623214 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b" event={"ID":"fdbde0a6-7fe0-4fbb-8a91-766140234fc7","Type":"ContainerStarted","Data":"221f1d34fb81b89e2cd1702c9df6bdae019b1279df61d9846a9a09f6cebec3e3"}
Apr 23 13:34:45.623269 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:45.623265 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b" event={"ID":"fdbde0a6-7fe0-4fbb-8a91-766140234fc7","Type":"ContainerStarted","Data":"8c7e0d98b86b0ebacc2aee3d878a27df3f9fce5e69083916e6142a00ca14acc9"}
Apr 23 13:34:45.640677 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:45.640620 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-ml98b" podStartSLOduration=1.456336122 podStartE2EDuration="2.640607329s" podCreationTimestamp="2026-04-23 13:34:43 +0000 UTC" firstStartedPulling="2026-04-23 13:34:43.486665396 +0000 UTC m=+150.869194504" lastFinishedPulling="2026-04-23 13:34:44.670936604 +0000 UTC m=+152.053465711" observedRunningTime="2026-04-23 13:34:45.639271702 +0000 UTC m=+153.021800833" watchObservedRunningTime="2026-04-23 13:34:45.640607329 +0000 UTC m=+153.023136458"
Apr 23 13:34:46.017443 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:46.017354 2567 patch_prober.go:28] interesting pod/image-registry-5658766989-n4pmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 13:34:46.017582 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:46.017416 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5658766989-n4pmr" podUID="c27144db-cd34-4e4d-8b1f-7dd4033de254" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:34:47.432813 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.432781 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"]
Apr 23 13:34:47.436208 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.436185 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.438915 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.438892 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 23 13:34:47.439045 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.438990 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 23 13:34:47.440366 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.440346 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-2zdkd\""
Apr 23 13:34:47.440478 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.440403 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 23 13:34:47.447212 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.447190 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tf56p"]
Apr 23 13:34:47.451086 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.451068 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"]
Apr 23 13:34:47.451220 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.451208 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.453855 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.453838 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 13:34:47.453961 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.453845 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 13:34:47.454021 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.453961 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 13:34:47.454114 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.454096 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2z86s\""
Apr 23 13:34:47.486643 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.486608 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/def66f8e-82f1-4a4a-9ee8-e407c64ef503-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.486815 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.486646 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjxfx\" (UniqueName: \"kubernetes.io/projected/a919ee68-d491-466e-afde-2aabacadceda-kube-api-access-jjxfx\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.486815 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.486676 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7fjl\" (UniqueName: \"kubernetes.io/projected/def66f8e-82f1-4a4a-9ee8-e407c64ef503-kube-api-access-d7fjl\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.486815 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.486697 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a919ee68-d491-466e-afde-2aabacadceda-sys\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.486815 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.486764 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.486995 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.486851 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a919ee68-d491-466e-afde-2aabacadceda-metrics-client-ca\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.486995 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.486918 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-textfile\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.486995 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.486954 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/def66f8e-82f1-4a4a-9ee8-e407c64ef503-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.487121 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.487015 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-tls\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.487121 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.487059 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-wtmp\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.487121 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.487088 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-accelerators-collector-config\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.487298 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.487171 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/def66f8e-82f1-4a4a-9ee8-e407c64ef503-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.487348 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.487291 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/def66f8e-82f1-4a4a-9ee8-e407c64ef503-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.487348 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.487325 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a919ee68-d491-466e-afde-2aabacadceda-root\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.487431 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.487359 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/def66f8e-82f1-4a4a-9ee8-e407c64ef503-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.588194 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588156 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-textfile\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.588370 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588203 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/def66f8e-82f1-4a4a-9ee8-e407c64ef503-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.588429 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588385 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-tls\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.588498 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588426 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-wtmp\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.588498 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588454 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-accelerators-collector-config\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.588607 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:47.588497 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 23 13:34:47.588607 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:47.588565 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-tls podName:a919ee68-d491-466e-afde-2aabacadceda nodeName:}" failed. No retries permitted until 2026-04-23 13:34:48.088544401 +0000 UTC m=+155.471073523 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-tls") pod "node-exporter-tf56p" (UID: "a919ee68-d491-466e-afde-2aabacadceda") : secret "node-exporter-tls" not found
Apr 23 13:34:47.588607 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588576 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-textfile\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.588766 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588618 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-wtmp\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.588766 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588658 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/def66f8e-82f1-4a4a-9ee8-e407c64ef503-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.588766 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588696 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/def66f8e-82f1-4a4a-9ee8-e407c64ef503-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.588766 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588726 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a919ee68-d491-466e-afde-2aabacadceda-root\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.588766 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588761 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/def66f8e-82f1-4a4a-9ee8-e407c64ef503-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.589020 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588813 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/def66f8e-82f1-4a4a-9ee8-e407c64ef503-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.589020 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588840 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjxfx\" (UniqueName: \"kubernetes.io/projected/a919ee68-d491-466e-afde-2aabacadceda-kube-api-access-jjxfx\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.589020 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588873 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7fjl\" (UniqueName: \"kubernetes.io/projected/def66f8e-82f1-4a4a-9ee8-e407c64ef503-kube-api-access-d7fjl\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.589020 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588897 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a919ee68-d491-466e-afde-2aabacadceda-sys\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.589020 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588923 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.589020 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.588978 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a919ee68-d491-466e-afde-2aabacadceda-metrics-client-ca\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.589020 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.589006 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-accelerators-collector-config\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.589460 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.589051 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/def66f8e-82f1-4a4a-9ee8-e407c64ef503-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.589460 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.589094 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a919ee68-d491-466e-afde-2aabacadceda-root\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.589460 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.589387 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/def66f8e-82f1-4a4a-9ee8-e407c64ef503-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.589460 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.589421 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/def66f8e-82f1-4a4a-9ee8-e407c64ef503-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.589648 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.589472 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a919ee68-d491-466e-afde-2aabacadceda-sys\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.589648 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.589503 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a919ee68-d491-466e-afde-2aabacadceda-metrics-client-ca\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.591562 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.591518 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.591562 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.591547 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/def66f8e-82f1-4a4a-9ee8-e407c64ef503-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.591964 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.591942 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/def66f8e-82f1-4a4a-9ee8-e407c64ef503-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.597324 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.597300 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjxfx\" (UniqueName: \"kubernetes.io/projected/a919ee68-d491-466e-afde-2aabacadceda-kube-api-access-jjxfx\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:47.597566 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.597546 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7fjl\" (UniqueName: \"kubernetes.io/projected/def66f8e-82f1-4a4a-9ee8-e407c64ef503-kube-api-access-d7fjl\") pod \"kube-state-metrics-69db897b98-bqhpf\" (UID: \"def66f8e-82f1-4a4a-9ee8-e407c64ef503\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.749382 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.749290 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"
Apr 23 13:34:47.875482 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:47.875448 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bqhpf"]
Apr 23 13:34:47.879617 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:47.879582 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddef66f8e_82f1_4a4a_9ee8_e407c64ef503.slice/crio-7b40c8d14efbb4ab87a4096dfca47230011ff88ff12c6e1846ea004350fcc8c5 WatchSource:0}: Error finding container 7b40c8d14efbb4ab87a4096dfca47230011ff88ff12c6e1846ea004350fcc8c5: Status 404 returned error can't find the container with id 7b40c8d14efbb4ab87a4096dfca47230011ff88ff12c6e1846ea004350fcc8c5
Apr 23 13:34:48.092731 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:48.092634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-tls\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:48.094879 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:48.094856 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a919ee68-d491-466e-afde-2aabacadceda-node-exporter-tls\") pod \"node-exporter-tf56p\" (UID: \"a919ee68-d491-466e-afde-2aabacadceda\") " pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:48.360620 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:48.360218 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tf56p"
Apr 23 13:34:48.371135 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:48.370836 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda919ee68_d491_466e_afde_2aabacadceda.slice/crio-c497da0bd1c0217b63a9fbbfffc76bd4f0766fc2a36ff5f681ef0c9f2003ac56 WatchSource:0}: Error finding container c497da0bd1c0217b63a9fbbfffc76bd4f0766fc2a36ff5f681ef0c9f2003ac56: Status 404 returned error can't find the container with id c497da0bd1c0217b63a9fbbfffc76bd4f0766fc2a36ff5f681ef0c9f2003ac56
Apr 23 13:34:48.633520 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:48.633417 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tf56p" event={"ID":"a919ee68-d491-466e-afde-2aabacadceda","Type":"ContainerStarted","Data":"c497da0bd1c0217b63a9fbbfffc76bd4f0766fc2a36ff5f681ef0c9f2003ac56"}
Apr 23 13:34:48.634666 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:48.634636 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf" event={"ID":"def66f8e-82f1-4a4a-9ee8-e407c64ef503","Type":"ContainerStarted","Data":"7b40c8d14efbb4ab87a4096dfca47230011ff88ff12c6e1846ea004350fcc8c5"}
Apr 23 13:34:48.991645 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:48.991547 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2hdsq" podUID="7b86f405-4871-42b3-aa86-bde954086fa9"
Apr 23 13:34:49.007931 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:49.007892 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rhpd9" podUID="441292d6-55dd-4164-b5f7-2bdf45288757"
Apr 23 13:34:49.227204 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:49.227157 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-c246c" podUID="69a4531d-2959-43b5-929f-9d7ddf10163b"
Apr 23 13:34:49.638654 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:49.638617 2567 generic.go:358] "Generic (PLEG): container finished" podID="a919ee68-d491-466e-afde-2aabacadceda" containerID="4bcc928058c13b078b7688bbb50ce76a2336cd68b928e9592856103abd7f3a96" exitCode=0
Apr 23 13:34:49.639121 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:49.638702 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tf56p" event={"ID":"a919ee68-d491-466e-afde-2aabacadceda","Type":"ContainerDied","Data":"4bcc928058c13b078b7688bbb50ce76a2336cd68b928e9592856103abd7f3a96"}
Apr 23 13:34:49.640753 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:49.640732 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rhpd9"
Apr 23 13:34:49.640753 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:49.640739 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf" event={"ID":"def66f8e-82f1-4a4a-9ee8-e407c64ef503","Type":"ContainerStarted","Data":"8dba96eee931812e7db40e6a94c2d18f9fa2f1a6c1811b74316cfc0d65daa9c7"}
Apr 23 13:34:49.640937 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:49.640770 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf" event={"ID":"def66f8e-82f1-4a4a-9ee8-e407c64ef503","Type":"ContainerStarted","Data":"ca1cc08a84b36b2e4c5b87089f64f54fb690a8dbe3f451294c834f06c6e93d08"}
Apr 23 13:34:49.640937 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:49.640795 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf" event={"ID":"def66f8e-82f1-4a4a-9ee8-e407c64ef503","Type":"ContainerStarted","Data":"b9bff0eee27c4131e098a049e1db44cf07fa394d3e801ba352cf89183896ea81"}
Apr 23 13:34:49.640937 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:49.640912 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2hdsq"
Apr 23 13:34:49.682800 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:49.682745 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-bqhpf" podStartSLOduration=1.361378156 podStartE2EDuration="2.682727574s" podCreationTimestamp="2026-04-23 13:34:47 +0000 UTC" firstStartedPulling="2026-04-23 13:34:47.881477463 +0000 UTC m=+155.264006571" lastFinishedPulling="2026-04-23 13:34:49.202826882 +0000 UTC m=+156.585355989" observedRunningTime="2026-04-23 13:34:49.681954979 +0000 UTC m=+157.064484110" watchObservedRunningTime="2026-04-23 13:34:49.682727574 +0000 UTC m=+157.065256704"
Apr 23 13:34:50.645059 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:50.645013 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tf56p" event={"ID":"a919ee68-d491-466e-afde-2aabacadceda","Type":"ContainerStarted","Data":"489a1c6f961d0a1ad8176c656abcf0d2cc5eb52cb276d90097bc9cbb444d5adc"}
Apr 23 13:34:50.645059 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:50.645069 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tf56p" event={"ID":"a919ee68-d491-466e-afde-2aabacadceda","Type":"ContainerStarted","Data":"bf7e06cf42a5c6971ac495a1595f36bef5b9a08aa51326f71dbf5e4b84ce7928"}
Apr 23 13:34:50.667755 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:50.667698 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tf56p" podStartSLOduration=2.835827653 podStartE2EDuration="3.667681362s" podCreationTimestamp="2026-04-23 13:34:47 +0000 UTC" firstStartedPulling="2026-04-23 13:34:48.373135014 +0000 UTC m=+155.755664128" lastFinishedPulling="2026-04-23 13:34:49.204988725 +0000 UTC m=+156.587517837" observedRunningTime="2026-04-23 13:34:50.665775575 +0000 UTC m=+158.048304705" watchObservedRunningTime="2026-04-23 13:34:50.667681362 +0000 UTC m=+158.050210514"
Apr 23 13:34:52.206984 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:52.206949 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-x877v"]
Apr 23 13:34:52.213025 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:52.212993 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-x877v"
Apr 23 13:34:52.216052 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:52.216006 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-cv4t7\""
Apr 23 13:34:52.216274 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:52.216250 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 23 13:34:52.219086 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:52.219062 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-x877v"]
Apr 23 13:34:52.228698 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:52.228673 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-x877v\" (UID: \"a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-x877v"
Apr 23 13:34:52.329871 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:52.329831 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-x877v\" (UID: \"a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-x877v"
Apr 23 13:34:52.330153 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:52.330123 2567 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 23 13:34:52.330408 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:34:52.330394 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1-monitoring-plugin-cert podName:a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:52.830369647 +0000 UTC m=+160.212898758 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-x877v" (UID: "a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1") : secret "monitoring-plugin-cert" not found
Apr 23 13:34:52.833309 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:52.833275 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-x877v\" (UID: \"a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-x877v"
Apr 23 13:34:52.835761 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:52.835737 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-x877v\" (UID: \"a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-x877v"
Apr 23 13:34:53.123451 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.123420 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-x877v"
Apr 23 13:34:53.240334 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.240306 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-x877v"]
Apr 23 13:34:53.243169 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:53.243144 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda57cae69_bdd0_49ee_b9e0_c6b7a5cb4ff1.slice/crio-ce28b70b13aa06e540617bc4695df58384f555418db5e947d9bee0004008aa5d WatchSource:0}: Error finding container ce28b70b13aa06e540617bc4695df58384f555418db5e947d9bee0004008aa5d: Status 404 returned error can't find the container with id ce28b70b13aa06e540617bc4695df58384f555418db5e947d9bee0004008aa5d
Apr 23 13:34:53.655521 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.655474 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-x877v" event={"ID":"a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1","Type":"ContainerStarted","Data":"ce28b70b13aa06e540617bc4695df58384f555418db5e947d9bee0004008aa5d"}
Apr 23 13:34:53.660808 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.660773 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:34:53.665950 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.665930 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.669220 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.669191 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 13:34:53.669469 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.669450 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 13:34:53.669647 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.669630 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4de62imb3slip\""
Apr 23 13:34:53.669825 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.669803 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wrfk7\""
Apr 23 13:34:53.669944 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.669922 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 13:34:53.670693 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.670647 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 13:34:53.670693 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.670659 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 13:34:53.670693 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.670673 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 13:34:53.670975 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.670697 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 13:34:53.670975 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.670894 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 13:34:53.670975 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.670900 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 13:34:53.671541 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.671377 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 13:34:53.671541 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.671400 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 13:34:53.671713 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.671598 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 13:34:53.673033 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.673013 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 13:34:53.677246 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.677204 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:34:53.740997 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.740955 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741165 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741005 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741165 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741035 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741165 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741086 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-config\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741366 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741213 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zwgj\" (UniqueName: \"kubernetes.io/projected/6c9a816a-e687-49b3-b624-6c95f100e264-kube-api-access-7zwgj\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741366 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741273 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741366 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741303 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-web-config\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741366 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741337 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741366 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741359 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741584 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741382 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741584 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741410 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6c9a816a-e687-49b3-b624-6c95f100e264-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741584 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741429 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741584 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741450 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741584 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741465 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741584 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741488 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6c9a816a-e687-49b3-b624-6c95f100e264-config-out\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741584 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741502 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741584 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741579 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.741961 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.741603 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.843407 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.842925 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.843646 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.843687 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.843716 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-config\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.843759 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zwgj\" (UniqueName: \"kubernetes.io/projected/6c9a816a-e687-49b3-b624-6c95f100e264-kube-api-access-7zwgj\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.843786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.843809 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-web-config\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.844496 ip-10-0-141-176
kubenswrapper[2567]: I0423 13:34:53.843841 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.843864 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.843893 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.843928 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6c9a816a-e687-49b3-b624-6c95f100e264-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.843950 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.843971 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.843994 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.844028 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6c9a816a-e687-49b3-b624-6c95f100e264-config-out\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.844053 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.844106 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.844496 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.844136 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.845449 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.844877 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.845449 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.844918 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.847302 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.847275 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.848121 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.847646 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.848121 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.847939 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:53.850011 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.849401 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 23 13:34:53.850011 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.849583 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.850011 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.849858 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.850011 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.849951 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.850282 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.850128 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-config\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.850282 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.850275 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6c9a816a-e687-49b3-b624-6c95f100e264-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.851130 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.851109 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.851373 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.851352 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.855916 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.855886 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.856007 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.855953 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-web-config\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.860202 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.860173 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.860298 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.860216 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zwgj\" (UniqueName: \"kubernetes.io/projected/6c9a816a-e687-49b3-b624-6c95f100e264-kube-api-access-7zwgj\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.860727 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.860708 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6c9a816a-e687-49b3-b624-6c95f100e264-config-out\") pod \"prometheus-k8s-0\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:53.945143 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.944936 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq"
Apr 23 13:34:53.945329 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.945198 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9"
Apr 23 13:34:53.947913 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.947884 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7b86f405-4871-42b3-aa86-bde954086fa9-metrics-tls\") pod \"dns-default-2hdsq\" (UID: \"7b86f405-4871-42b3-aa86-bde954086fa9\") " pod="openshift-dns/dns-default-2hdsq"
Apr 23 13:34:53.948773 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.948747 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/441292d6-55dd-4164-b5f7-2bdf45288757-cert\") pod \"ingress-canary-rhpd9\" (UID: \"441292d6-55dd-4164-b5f7-2bdf45288757\") " pod="openshift-ingress-canary/ingress-canary-rhpd9"
Apr 23 13:34:53.979581 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:53.979547 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:54.137481 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:54.137448 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:34:54.141988 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:54.141959 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c9a816a_e687_49b3_b624_6c95f100e264.slice/crio-4326838413c6cc2dfa2ebb3c61d6d57cfdf3a8a4d57e9466064e8859adeaedf0 WatchSource:0}: Error finding container 4326838413c6cc2dfa2ebb3c61d6d57cfdf3a8a4d57e9466064e8859adeaedf0: Status 404 returned error can't find the container with id 4326838413c6cc2dfa2ebb3c61d6d57cfdf3a8a4d57e9466064e8859adeaedf0
Apr 23 13:34:54.144450 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:54.144418 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-svh2p\""
Apr 23 13:34:54.144503 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:54.144482 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cqcw8\""
Apr 23 13:34:54.151922 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:54.151899 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2hdsq"
Apr 23 13:34:54.152199 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:54.152179 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rhpd9"
Apr 23 13:34:54.293295 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:54.293267 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rhpd9"]
Apr 23 13:34:54.305432 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:54.305402 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2hdsq"]
Apr 23 13:34:54.616615 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:54.616573 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod441292d6_55dd_4164_b5f7_2bdf45288757.slice/crio-552a31a7b45a6e7b912a7bd4804db66c7d1c80f6d4fc124af1ded0afd8ca0d28 WatchSource:0}: Error finding container 552a31a7b45a6e7b912a7bd4804db66c7d1c80f6d4fc124af1ded0afd8ca0d28: Status 404 returned error can't find the container with id 552a31a7b45a6e7b912a7bd4804db66c7d1c80f6d4fc124af1ded0afd8ca0d28
Apr 23 13:34:54.617295 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:34:54.617257 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b86f405_4871_42b3_aa86_bde954086fa9.slice/crio-f40449ef3cbe965701f61c5285e245200a2b804b66b012251792b15695b2f095 WatchSource:0}: Error finding container f40449ef3cbe965701f61c5285e245200a2b804b66b012251792b15695b2f095: Status 404 returned error can't find the container with id f40449ef3cbe965701f61c5285e245200a2b804b66b012251792b15695b2f095
Apr 23 13:34:54.658796 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:54.658755 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rhpd9" event={"ID":"441292d6-55dd-4164-b5f7-2bdf45288757","Type":"ContainerStarted","Data":"552a31a7b45a6e7b912a7bd4804db66c7d1c80f6d4fc124af1ded0afd8ca0d28"}
Apr 23 13:34:54.659832 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:54.659803 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerStarted","Data":"4326838413c6cc2dfa2ebb3c61d6d57cfdf3a8a4d57e9466064e8859adeaedf0"}
Apr 23 13:34:54.660756 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:54.660733 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2hdsq" event={"ID":"7b86f405-4871-42b3-aa86-bde954086fa9","Type":"ContainerStarted","Data":"f40449ef3cbe965701f61c5285e245200a2b804b66b012251792b15695b2f095"}
Apr 23 13:34:55.665967 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:55.665921 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-x877v" event={"ID":"a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1","Type":"ContainerStarted","Data":"9cacb58b05ebee9e920eae4272736632f04381483938c29f6195a5973a95aaea"}
Apr 23 13:34:55.666456 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:55.666377 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-x877v"
Apr 23 13:34:55.668302 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:55.668254 2567 generic.go:358] "Generic (PLEG): container finished" podID="6c9a816a-e687-49b3-b624-6c95f100e264" containerID="794741b49e0f19ef4c1e86aba055aa30ef729f24e23ca917f893c3d1cb07609d" exitCode=0
Apr 23 13:34:55.668436 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:55.668335 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerDied","Data":"794741b49e0f19ef4c1e86aba055aa30ef729f24e23ca917f893c3d1cb07609d"}
Apr 23 13:34:55.672707 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:55.672666 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-x877v"
Apr 23 13:34:55.708690 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:55.708632 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-x877v" podStartSLOduration=2.29087752 podStartE2EDuration="3.708612254s" podCreationTimestamp="2026-04-23 13:34:52 +0000 UTC" firstStartedPulling="2026-04-23 13:34:53.24500641 +0000 UTC m=+160.627535521" lastFinishedPulling="2026-04-23 13:34:54.662741148 +0000 UTC m=+162.045270255" observedRunningTime="2026-04-23 13:34:55.681972092 +0000 UTC m=+163.064501223" watchObservedRunningTime="2026-04-23 13:34:55.708612254 +0000 UTC m=+163.091141390"
Apr 23 13:34:56.016729 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:56.016638 2567 patch_prober.go:28] interesting pod/image-registry-5658766989-n4pmr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 13:34:56.016729 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:56.016704 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5658766989-n4pmr" podUID="c27144db-cd34-4e4d-8b1f-7dd4033de254" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:34:56.678402 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:56.678360 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rhpd9" event={"ID":"441292d6-55dd-4164-b5f7-2bdf45288757","Type":"ContainerStarted","Data":"0672f5bdee1f826cf58d96d7f021411eacffbb5bc6e55f2deb27e57f90f41fc7"}
Apr 23 13:34:56.696737 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:56.696675 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rhpd9" podStartSLOduration=129.735740302 podStartE2EDuration="2m11.696656987s" podCreationTimestamp="2026-04-23 13:32:45 +0000 UTC" firstStartedPulling="2026-04-23 13:34:54.618577533 +0000 UTC m=+162.001106657" lastFinishedPulling="2026-04-23 13:34:56.579494221 +0000 UTC m=+163.962023342" observedRunningTime="2026-04-23 13:34:56.695783848 +0000 UTC m=+164.078312978" watchObservedRunningTime="2026-04-23 13:34:56.696656987 +0000 UTC m=+164.079186117"
Apr 23 13:34:57.684195 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:57.684149 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2hdsq" event={"ID":"7b86f405-4871-42b3-aa86-bde954086fa9","Type":"ContainerStarted","Data":"41d6fd08edc95728e97747b89d07034f1a73a9e3245d9f55eb434a67ef120ab6"}
Apr 23 13:34:57.684195 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:57.684197 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2hdsq" event={"ID":"7b86f405-4871-42b3-aa86-bde954086fa9","Type":"ContainerStarted","Data":"400d89257a106c0a59df2db5a65122459f7027cb144258b50b6977aeaf8db0fa"}
Apr 23 13:34:57.704529 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:57.704479 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2hdsq" podStartSLOduration=130.742510123 podStartE2EDuration="2m12.704460812s" podCreationTimestamp="2026-04-23 13:32:45 +0000 UTC" firstStartedPulling="2026-04-23 13:34:54.619215802 +0000 UTC m=+162.001744910" lastFinishedPulling="2026-04-23 13:34:56.581166492 +0000 UTC m=+163.963695599" observedRunningTime="2026-04-23 13:34:57.701925884 +0000 UTC m=+165.084455016" watchObservedRunningTime="2026-04-23 13:34:57.704460812 +0000 UTC m=+165.086989944"
Apr 23 13:34:58.687718 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:58.687682 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2hdsq"
Apr 23 13:34:59.694247 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:59.694117 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerStarted","Data":"472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2"}
Apr 23 13:34:59.694247 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:34:59.694195 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerStarted","Data":"7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3"}
Apr 23 13:35:00.205455 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:00.205412 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c"
Apr 23 13:35:01.032101 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.031880 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5658766989-n4pmr" podUID="c27144db-cd34-4e4d-8b1f-7dd4033de254" containerName="registry" containerID="cri-o://d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957" gracePeriod=30
Apr 23 13:35:01.261343 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.261322 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:35:01.318046 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.318016 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c27144db-cd34-4e4d-8b1f-7dd4033de254-installation-pull-secrets\") pod \"c27144db-cd34-4e4d-8b1f-7dd4033de254\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") "
Apr 23 13:35:01.318181 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.318059 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls\") pod \"c27144db-cd34-4e4d-8b1f-7dd4033de254\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") "
Apr 23 13:35:01.318181 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.318090 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-bound-sa-token\") pod \"c27144db-cd34-4e4d-8b1f-7dd4033de254\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") "
Apr 23 13:35:01.318181 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.318159 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c27144db-cd34-4e4d-8b1f-7dd4033de254-trusted-ca\") pod \"c27144db-cd34-4e4d-8b1f-7dd4033de254\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") "
Apr 23 13:35:01.318345 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.318203 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c27144db-cd34-4e4d-8b1f-7dd4033de254-ca-trust-extracted\") pod \"c27144db-cd34-4e4d-8b1f-7dd4033de254\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") "
Apr 23 13:35:01.318345 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.318256 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c27144db-cd34-4e4d-8b1f-7dd4033de254-image-registry-private-configuration\") pod \"c27144db-cd34-4e4d-8b1f-7dd4033de254\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") "
Apr 23 13:35:01.318345 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.318282 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-certificates\") pod \"c27144db-cd34-4e4d-8b1f-7dd4033de254\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") "
Apr 23 13:35:01.318345 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.318313 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szjbn\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-kube-api-access-szjbn\") pod \"c27144db-cd34-4e4d-8b1f-7dd4033de254\" (UID: \"c27144db-cd34-4e4d-8b1f-7dd4033de254\") "
Apr 23 13:35:01.318800 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.318769 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c27144db-cd34-4e4d-8b1f-7dd4033de254-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c27144db-cd34-4e4d-8b1f-7dd4033de254" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:35:01.319010 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.318959 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c27144db-cd34-4e4d-8b1f-7dd4033de254" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:35:01.320523 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.320495 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c27144db-cd34-4e4d-8b1f-7dd4033de254" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:35:01.320523 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.320514 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c27144db-cd34-4e4d-8b1f-7dd4033de254-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c27144db-cd34-4e4d-8b1f-7dd4033de254" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:01.320710 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.320687 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c27144db-cd34-4e4d-8b1f-7dd4033de254" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:35:01.320953 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.320923 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-kube-api-access-szjbn" (OuterVolumeSpecName: "kube-api-access-szjbn") pod "c27144db-cd34-4e4d-8b1f-7dd4033de254" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254"). InnerVolumeSpecName "kube-api-access-szjbn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:35:01.321122 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.321100 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c27144db-cd34-4e4d-8b1f-7dd4033de254-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c27144db-cd34-4e4d-8b1f-7dd4033de254" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:35:01.327069 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.327042 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c27144db-cd34-4e4d-8b1f-7dd4033de254-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c27144db-cd34-4e4d-8b1f-7dd4033de254" (UID: "c27144db-cd34-4e4d-8b1f-7dd4033de254"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:35:01.419547 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.419456 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c27144db-cd34-4e4d-8b1f-7dd4033de254-installation-pull-secrets\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:35:01.419547 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.419491 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:35:01.419547 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.419500 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-bound-sa-token\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:35:01.419547 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.419509 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c27144db-cd34-4e4d-8b1f-7dd4033de254-trusted-ca\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:35:01.419547 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.419518 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c27144db-cd34-4e4d-8b1f-7dd4033de254-ca-trust-extracted\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:35:01.419547 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.419529 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c27144db-cd34-4e4d-8b1f-7dd4033de254-image-registry-private-configuration\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:35:01.419547 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.419538 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c27144db-cd34-4e4d-8b1f-7dd4033de254-registry-certificates\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:35:01.419547 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.419547 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-szjbn\" (UniqueName: \"kubernetes.io/projected/c27144db-cd34-4e4d-8b1f-7dd4033de254-kube-api-access-szjbn\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:35:01.701910 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.701814 2567 generic.go:358] "Generic (PLEG): container finished" podID="c27144db-cd34-4e4d-8b1f-7dd4033de254" containerID="d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957" exitCode=0
Apr 23 13:35:01.701910 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.701880 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5658766989-n4pmr"
Apr 23 13:35:01.701910 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.701900 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5658766989-n4pmr" event={"ID":"c27144db-cd34-4e4d-8b1f-7dd4033de254","Type":"ContainerDied","Data":"d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957"}
Apr 23 13:35:01.702160 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.701937 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5658766989-n4pmr" event={"ID":"c27144db-cd34-4e4d-8b1f-7dd4033de254","Type":"ContainerDied","Data":"05e67036ae7d822d12bccb86434ecab388e50359a4bf3eb7e627d28b1540ec76"}
Apr 23 13:35:01.702160 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.701956 2567 scope.go:117] "RemoveContainer" containerID="d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957"
Apr 23 13:35:01.705089 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.705064 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerStarted","Data":"aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac"}
Apr 23 13:35:01.705212 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.705094 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerStarted","Data":"0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d"}
Apr 23 13:35:01.705212 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.705105 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerStarted","Data":"70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9"}
Apr 23 13:35:01.705212 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.705114 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerStarted","Data":"1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5"}
Apr 23 13:35:01.710895 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.710876 2567 scope.go:117] "RemoveContainer" containerID="d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957"
Apr 23 13:35:01.711165 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:35:01.711144 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957\": container with ID starting with d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957 not found: ID does not exist" containerID="d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957"
Apr 23 13:35:01.711273 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.711177 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957"} err="failed to get container status \"d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957\": rpc error: code = NotFound desc = could not find container \"d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957\": container with ID starting with d8853f72a508b6a000ed6e56bf738ddad8bf8b80ea649cf847d0bf7bdc5c6957 not found: ID does not exist"
Apr 23 13:35:01.734639 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.734577 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.974904457 podStartE2EDuration="8.734559929s" podCreationTimestamp="2026-04-23 13:34:53 +0000 UTC" firstStartedPulling="2026-04-23 13:34:54.144696131 +0000 UTC m=+161.527225254" lastFinishedPulling="2026-04-23 13:35:00.9043516 +0000 UTC m=+168.286880726" observedRunningTime="2026-04-23 13:35:01.731767165 +0000 UTC m=+169.114296288" watchObservedRunningTime="2026-04-23 13:35:01.734559929 +0000 UTC m=+169.117089058"
Apr 23 13:35:01.744434 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.744410 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5658766989-n4pmr"]
Apr 23 13:35:01.747929 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:01.747907 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5658766989-n4pmr"]
Apr 23 13:35:03.211737 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:03.211689 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c27144db-cd34-4e4d-8b1f-7dd4033de254" path="/var/lib/kubelet/pods/c27144db-cd34-4e4d-8b1f-7dd4033de254/volumes"
Apr 23 13:35:03.980368 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:03.980335 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:35:08.697288 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:08.697256 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2hdsq"
Apr 23 13:35:30.808127 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:30.808089 2567 generic.go:358] "Generic (PLEG): container finished" podID="e2357249-2dd2-4507-999a-e670d1aa527f" containerID="23f81f783aee1cce8c184ae548f6c3a7bfe119b849e468ac86b52c7e15fcee19" exitCode=0
Apr 23 13:35:30.808536 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:30.808163 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" event={"ID":"e2357249-2dd2-4507-999a-e670d1aa527f","Type":"ContainerDied","Data":"23f81f783aee1cce8c184ae548f6c3a7bfe119b849e468ac86b52c7e15fcee19"}
Apr 23 13:35:30.808536 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:30.808490 2567 scope.go:117] "RemoveContainer" containerID="23f81f783aee1cce8c184ae548f6c3a7bfe119b849e468ac86b52c7e15fcee19"
Apr 23 13:35:31.812295 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:31.812260 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rpgkv" event={"ID":"e2357249-2dd2-4507-999a-e670d1aa527f","Type":"ContainerStarted","Data":"01c8273570975100918d3c38a394703ce086e9589f931ce8196c72b0bdf7dcc2"}
Apr 23 13:35:53.980507 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:53.980462 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:35:54.004160 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:54.004135 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:35:54.899767 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:35:54.899734 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:12.038935 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.038898 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:36:12.040183 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.040125 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="prometheus" containerID="cri-o://7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3" gracePeriod=600
Apr 23 13:36:12.040573 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.040522 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="thanos-sidecar" containerID="cri-o://1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5" gracePeriod=600
Apr 23 13:36:12.040689 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.040581 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="kube-rbac-proxy-web" containerID="cri-o://70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9" gracePeriod=600
Apr 23 13:36:12.040689 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.040534 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="kube-rbac-proxy-thanos" containerID="cri-o://aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac" gracePeriod=600
Apr 23 13:36:12.040803 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.040554 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="config-reloader" containerID="cri-o://472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2" gracePeriod=600
Apr 23 13:36:12.040803 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.040387 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="kube-rbac-proxy" containerID="cri-o://0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d" gracePeriod=600
Apr 23 13:36:12.933388 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.933349 2567 generic.go:358] "Generic (PLEG): container finished" podID="6c9a816a-e687-49b3-b624-6c95f100e264" containerID="aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac" exitCode=0
Apr 23 13:36:12.933388 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.933379 2567 generic.go:358] "Generic (PLEG): container finished" podID="6c9a816a-e687-49b3-b624-6c95f100e264" containerID="0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d" exitCode=0
Apr 23 13:36:12.933388 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.933389 2567 generic.go:358] "Generic (PLEG): container finished" podID="6c9a816a-e687-49b3-b624-6c95f100e264" containerID="1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5" exitCode=0
Apr 23 13:36:12.933388 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.933395 2567 generic.go:358] "Generic (PLEG): container finished" podID="6c9a816a-e687-49b3-b624-6c95f100e264" containerID="472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2" exitCode=0
Apr 23 13:36:12.933388 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.933400 2567 generic.go:358] "Generic (PLEG): container finished" podID="6c9a816a-e687-49b3-b624-6c95f100e264" containerID="7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3" exitCode=0
Apr 23 13:36:12.933669 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.933420 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerDied","Data":"aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac"}
Apr 23 13:36:12.933669 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.933455 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerDied","Data":"0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d"}
Apr 23 13:36:12.933669 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.933465 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerDied","Data":"1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5"}
Apr 23 13:36:12.933669 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.933473 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerDied","Data":"472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2"}
Apr 23 13:36:12.933669 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:12.933482 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerDied","Data":"7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3"}
Apr 23 13:36:13.280173 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.280151 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:13.428955 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.428921 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-tls\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429133 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.428964 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6c9a816a-e687-49b3-b624-6c95f100e264-tls-assets\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429133 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.428992 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-metrics-client-ca\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429133 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429022 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-kube-rbac-proxy\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429133 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429057 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-trusted-ca-bundle\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429133 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429088 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-config\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429133 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429118 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-grpc-tls\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429472 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429150 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-metrics-client-certs\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429472 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429191 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429472 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429219 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6c9a816a-e687-49b3-b624-6c95f100e264-config-out\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429472 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429276 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-web-config\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429472 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429306 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429472 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429333 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-thanos-prometheus-http-client-file\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429472 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429365 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-k8s-rulefiles-0\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429472 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429394 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-kubelet-serving-ca-bundle\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429472 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429435 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-k8s-db\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429919 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429487 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-serving-certs-ca-bundle\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429919 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429533 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zwgj\" (UniqueName: \"kubernetes.io/projected/6c9a816a-e687-49b3-b624-6c95f100e264-kube-api-access-7zwgj\") pod \"6c9a816a-e687-49b3-b624-6c95f100e264\" (UID: \"6c9a816a-e687-49b3-b624-6c95f100e264\") "
Apr 23 13:36:13.429919 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429536 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:13.429919 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429744 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:13.429919 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.429771 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-trusted-ca-bundle\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.430483 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.430260 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:13.431144 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.431114 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:13.431944 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.431890 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:36:13.432301 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.432254 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:13.433314 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.433203 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c9a816a-e687-49b3-b624-6c95f100e264-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:36:13.433498 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.433464 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:13.433570 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.433530 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c9a816a-e687-49b3-b624-6c95f100e264-kube-api-access-7zwgj" (OuterVolumeSpecName: "kube-api-access-7zwgj") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "kube-api-access-7zwgj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:36:13.433682 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.433626 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c9a816a-e687-49b3-b624-6c95f100e264-config-out" (OuterVolumeSpecName: "config-out") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:36:13.433846 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.433818 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:13.433931 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.433914 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-config" (OuterVolumeSpecName: "config") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:13.434171 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.434150 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:13.434562 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.434541 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:13.434651 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.434562 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:13.434651 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.434580 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:13.434853 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.434827 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:13.443883 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.443858 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-web-config" (OuterVolumeSpecName: "web-config") pod "6c9a816a-e687-49b3-b624-6c95f100e264" (UID: "6c9a816a-e687-49b3-b624-6c95f100e264"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:13.530532 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530434 2567 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-metrics-client-certs\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530532 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530468 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530532 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530482 2567 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6c9a816a-e687-49b3-b624-6c95f100e264-config-out\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530532 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530495 2567 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-web-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530532 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530507 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530532 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530520 2567 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-thanos-prometheus-http-client-file\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530532 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530534 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530874 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530547 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530874 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530561 2567 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6c9a816a-e687-49b3-b624-6c95f100e264-prometheus-k8s-db\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530874 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530573 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530874 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530585 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zwgj\" (UniqueName: \"kubernetes.io/projected/6c9a816a-e687-49b3-b624-6c95f100e264-kube-api-access-7zwgj\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530874 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530597 2567 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-prometheus-k8s-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530874 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530610 2567 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6c9a816a-e687-49b3-b624-6c95f100e264-tls-assets\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530874 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530622 2567 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6c9a816a-e687-49b3-b624-6c95f100e264-configmap-metrics-client-ca\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530874 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530634 2567 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-kube-rbac-proxy\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530874 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530645 2567 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.530874 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.530657 2567 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6c9a816a-e687-49b3-b624-6c95f100e264-secret-grpc-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:36:13.938854 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.938818 2567 generic.go:358] "Generic (PLEG): container finished" podID="6c9a816a-e687-49b3-b624-6c95f100e264" containerID="70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9" exitCode=0
Apr 23 13:36:13.939055 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.938874 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerDied","Data":"70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9"}
Apr 23 13:36:13.939055 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.938908 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6c9a816a-e687-49b3-b624-6c95f100e264","Type":"ContainerDied","Data":"4326838413c6cc2dfa2ebb3c61d6d57cfdf3a8a4d57e9466064e8859adeaedf0"}
Apr 23 13:36:13.939055 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.938930 2567 scope.go:117] "RemoveContainer" containerID="aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac"
Apr 23 13:36:13.939055 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.938946 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:13.947252 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.947200 2567 scope.go:117] "RemoveContainer" containerID="0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d"
Apr 23 13:36:13.953691 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.953674 2567 scope.go:117] "RemoveContainer" containerID="70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9"
Apr 23 13:36:13.962389 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.962361 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:36:13.962603 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.962588 2567 scope.go:117] "RemoveContainer" containerID="1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5"
Apr 23 13:36:13.966325 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.966301 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:36:13.969403 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.969384 2567 scope.go:117] "RemoveContainer" containerID="472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2"
Apr 23 13:36:13.975609 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.975592 2567 scope.go:117] "RemoveContainer" containerID="7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3"
Apr 23 13:36:13.981991 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.981975 2567 scope.go:117] "RemoveContainer" containerID="794741b49e0f19ef4c1e86aba055aa30ef729f24e23ca917f893c3d1cb07609d"
Apr 23 13:36:13.988298 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.988278 2567 scope.go:117] "RemoveContainer" containerID="aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac"
Apr 23 13:36:13.988540 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:36:13.988521 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac\": container with ID starting with aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac not found: ID does not exist" containerID="aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac"
Apr 23 13:36:13.988597 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.988551 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac"} err="failed to get container status \"aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac\": rpc error: code = NotFound desc = could not find container \"aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac\": container with ID starting with aa4b53dd298235f342584ff2852bc35e3c12acc4b730a69d3739259b47a385ac not found: ID does not exist"
Apr 23 13:36:13.988597 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.988572 2567 scope.go:117] "RemoveContainer" containerID="0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d"
Apr 23 13:36:13.988810 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:36:13.988790 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d\": container with ID starting with 0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d not found: ID does not exist" containerID="0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d"
Apr 23 13:36:13.988849 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.988817 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d"} err="failed to get container status \"0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d\": rpc error: code = NotFound desc = could not find container \"0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d\": container with ID starting with 0dfb2ac8a9d1c4f428d0132feae7e7cd85d4e5823bd55b0fe88d9c3aceca738d not found: ID does not exist"
Apr 23 13:36:13.988849 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.988833 2567 scope.go:117] "RemoveContainer" containerID="70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9"
Apr 23 13:36:13.989044 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:36:13.989028 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9\": container with ID starting with 70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9 not found: ID does not exist" containerID="70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9"
Apr 23 13:36:13.989087 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.989049 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9"} err="failed to get container status \"70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9\": rpc error: code = NotFound desc = could not find container \"70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9\": container with ID starting with 70533fd9e1c8186c632777624988a7235fde6c05eb129ef64d6d3f4d190fd2b9 not found: ID does not exist"
Apr 23 13:36:13.989087 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.989063 2567 scope.go:117] "RemoveContainer" containerID="1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5"
Apr 23 13:36:13.989271 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:36:13.989254 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5\": container with ID starting with 1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5 not found: ID does not exist" containerID="1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5"
Apr 23 13:36:13.989321 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.989277 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5"} err="failed to get container status \"1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5\": rpc error: code = NotFound desc = could not find container \"1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5\": container with ID starting with 1dc0fd13a32fabc3bdecadbfb8a2f97321a9953f435ff0fac87b9bd4d1eb72e5 not found: ID does not exist"
Apr 23 13:36:13.989321 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.989290 2567 scope.go:117] "RemoveContainer" containerID="472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2"
Apr 23 13:36:13.989483 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:36:13.989469 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2\": container with ID starting with 472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2 not found: ID does not exist" containerID="472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2"
Apr 23 13:36:13.989523 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.989486 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2"} err="failed to get container status \"472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2\": rpc error: code = NotFound desc = could not find container \"472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2\": container with ID starting with 472dce2142c3d962210acf3c3266b7c20e23b592a4c06161f3fbea1e901254d2 not found: ID does not exist"
Apr 23 13:36:13.989523 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.989498 2567 scope.go:117] "RemoveContainer" containerID="7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3"
Apr 23 13:36:13.989706 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:36:13.989692 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3\": container with ID starting with 7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3 not found: ID does not exist" containerID="7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3"
Apr 23 13:36:13.989745 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.989708 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3"} err="failed to get container status \"7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3\": rpc error: code = NotFound desc = could not find container \"7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3\": container with ID starting with 7ef6f72353341fc7fd5119a40993dac5a9a1ded2c9d0a3fd34f00ccda7d627e3 not found: ID does not exist"
Apr 23 13:36:13.989745 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.989720 2567 scope.go:117] "RemoveContainer" containerID="794741b49e0f19ef4c1e86aba055aa30ef729f24e23ca917f893c3d1cb07609d"
Apr 23 13:36:13.989960 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:36:13.989944 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794741b49e0f19ef4c1e86aba055aa30ef729f24e23ca917f893c3d1cb07609d\": container with ID starting with 794741b49e0f19ef4c1e86aba055aa30ef729f24e23ca917f893c3d1cb07609d not found: ID does not exist" containerID="794741b49e0f19ef4c1e86aba055aa30ef729f24e23ca917f893c3d1cb07609d"
Apr 23 13:36:13.990005 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.989963 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794741b49e0f19ef4c1e86aba055aa30ef729f24e23ca917f893c3d1cb07609d"} err="failed to get container status \"794741b49e0f19ef4c1e86aba055aa30ef729f24e23ca917f893c3d1cb07609d\": rpc error: code = NotFound desc = could not find container \"794741b49e0f19ef4c1e86aba055aa30ef729f24e23ca917f893c3d1cb07609d\": container with ID starting with 794741b49e0f19ef4c1e86aba055aa30ef729f24e23ca917f893c3d1cb07609d not found: ID does not exist"
Apr 23 13:36:13.995390 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995368 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:36:13.995744 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995730 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="config-reloader"
Apr 23 13:36:13.995811 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995746 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="config-reloader"
Apr 23 13:36:13.995811 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995757 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="prometheus"
Apr 23 13:36:13.995811 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995763 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="prometheus"
Apr 23 13:36:13.995811 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995771 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c27144db-cd34-4e4d-8b1f-7dd4033de254" containerName="registry"
Apr 23 13:36:13.995811 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995777 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27144db-cd34-4e4d-8b1f-7dd4033de254" containerName="registry"
Apr 23 13:36:13.995811 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995787 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="kube-rbac-proxy-web"
Apr 23 13:36:13.995811 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995793 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="kube-rbac-proxy-web"
Apr 23 13:36:13.995811 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995799 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="kube-rbac-proxy"
Apr 23 13:36:13.995811 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995806 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="kube-rbac-proxy"
Apr 23 13:36:13.995811 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995813 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="kube-rbac-proxy-thanos"
Apr 23 13:36:13.996088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995819 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="kube-rbac-proxy-thanos"
Apr 23 13:36:13.996088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995825 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="thanos-sidecar"
Apr 23 13:36:13.996088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995832 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="thanos-sidecar"
Apr 23 13:36:13.996088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995838 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="init-config-reloader"
Apr 23 13:36:13.996088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995844 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="init-config-reloader"
Apr 23 13:36:13.996088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995888 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="kube-rbac-proxy-web"
Apr 23 13:36:13.996088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995896 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="kube-rbac-proxy"
Apr 23 13:36:13.996088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995901 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="kube-rbac-proxy-thanos"
Apr 23 13:36:13.996088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995908 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="prometheus"
Apr 23 13:36:13.996088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995915 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="config-reloader"
Apr 23 13:36:13.996088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995921 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" containerName="thanos-sidecar"
Apr 23 13:36:13.996088 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:13.995927 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c27144db-cd34-4e4d-8b1f-7dd4033de254" containerName="registry"
Apr 23 13:36:14.001143 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.001128 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.005750 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.005733 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 13:36:14.005836 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.005758 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 13:36:14.005891 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.005737 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 13:36:14.006379 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.006363 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 13:36:14.006466 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.006440 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 13:36:14.007282 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.007265 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 13:36:14.007715 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.007635 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wrfk7\""
Apr 23 13:36:14.008013 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.007802 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 13:36:14.008013 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.007969 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 13:36:14.008013 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.007987 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 13:36:14.008195 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.008133 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 13:36:14.009298 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.009285 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 13:36:14.012558 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.012544 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4de62imb3slip\""
Apr 23 13:36:14.032358 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.032329 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:36:14.038777 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.038757 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 13:36:14.039379 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.039355 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 13:36:14.135972 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.135936 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136153 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.135977 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136153 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.135995 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136153 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136055 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k6b4\" (UniqueName: \"kubernetes.io/projected/ea6ce526-eeec-4478-8ddf-ff76d231efb6-kube-api-access-4k6b4\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136153 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136103 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136153 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136128 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ea6ce526-eeec-4478-8ddf-ff76d231efb6-config-out\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136156 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ea6ce526-eeec-4478-8ddf-ff76d231efb6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136175 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136287 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136325 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ea6ce526-eeec-4478-8ddf-ff76d231efb6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136354 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-web-config\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136381 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136419 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136678 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136472 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136678 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136518 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136678 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136549 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136678 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136588 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-config\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.136678 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.136610 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.237907 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.237817 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4k6b4\" (UniqueName: \"kubernetes.io/projected/ea6ce526-eeec-4478-8ddf-ff76d231efb6-kube-api-access-4k6b4\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.237907 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.237860 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.237907 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.237878 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ea6ce526-eeec-4478-8ddf-ff76d231efb6-config-out\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.237907 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.237903 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ea6ce526-eeec-4478-8ddf-ff76d231efb6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238216 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238024 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238216 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238090 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238216 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238122 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ea6ce526-eeec-4478-8ddf-ff76d231efb6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238216 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238149 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-web-config\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238216 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238176 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238216 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238217 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238578 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238277 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238578 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238311 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238578 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238340 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238578 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238374 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-config\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238578 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238399 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238578 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238440 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238578 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238479 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238578 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238508 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ea6ce526-eeec-4478-8ddf-ff76d231efb6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.238578 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.238510 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.240611 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.240581 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.240717 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.240654 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.241354 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.241330 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.241449 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.241385 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.241516 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.241474 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.242588 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.242051 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ea6ce526-eeec-4478-8ddf-ff76d231efb6-config-out\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.242588 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.242133 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ea6ce526-eeec-4478-8ddf-ff76d231efb6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.242588 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.242197 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.242588 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.242326 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-web-config\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.243316 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.243291 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.243413 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.243297 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-config\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.243538 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.243521 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.243738 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.243717 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.243807 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.243775 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.243876 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.243841 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ea6ce526-eeec-4478-8ddf-ff76d231efb6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.243959 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.243941 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ea6ce526-eeec-4478-8ddf-ff76d231efb6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.246033 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.246015 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k6b4\" (UniqueName: \"kubernetes.io/projected/ea6ce526-eeec-4478-8ddf-ff76d231efb6-kube-api-access-4k6b4\") pod \"prometheus-k8s-0\" (UID: \"ea6ce526-eeec-4478-8ddf-ff76d231efb6\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.311165 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.311131 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:14.437218 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.437188 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:36:14.440326 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:36:14.440295 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea6ce526_eeec_4478_8ddf_ff76d231efb6.slice/crio-0e43d62bb30cbf1d72393f680d126273846bfc678b2babfee04b0f0183d63e62 WatchSource:0}: Error finding container 0e43d62bb30cbf1d72393f680d126273846bfc678b2babfee04b0f0183d63e62: Status 404 returned error can't find the container with id 0e43d62bb30cbf1d72393f680d126273846bfc678b2babfee04b0f0183d63e62
Apr 23 13:36:14.943167 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.943132 2567 generic.go:358] "Generic (PLEG): container finished" podID="ea6ce526-eeec-4478-8ddf-ff76d231efb6" containerID="8e77a2a6f28e95f0f754575836e3eb80505e48ee29f1c95b8faf36586d0cb599" exitCode=0
Apr 23 13:36:14.943365 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.943215 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ea6ce526-eeec-4478-8ddf-ff76d231efb6","Type":"ContainerDied","Data":"8e77a2a6f28e95f0f754575836e3eb80505e48ee29f1c95b8faf36586d0cb599"}
Apr 23 13:36:14.943365 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:14.943286 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ea6ce526-eeec-4478-8ddf-ff76d231efb6","Type":"ContainerStarted","Data":"0e43d62bb30cbf1d72393f680d126273846bfc678b2babfee04b0f0183d63e62"}
Apr 23 13:36:15.213247 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:15.211804 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c9a816a-e687-49b3-b624-6c95f100e264" path="/var/lib/kubelet/pods/6c9a816a-e687-49b3-b624-6c95f100e264/volumes"
Apr 23 13:36:15.950741 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:15.950701 2567 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ea6ce526-eeec-4478-8ddf-ff76d231efb6","Type":"ContainerStarted","Data":"c584a4bd30d10b45220016a9a5133b8157f28ec841e250987d6b0c15ab73c5b6"} Apr 23 13:36:15.950741 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:15.950745 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ea6ce526-eeec-4478-8ddf-ff76d231efb6","Type":"ContainerStarted","Data":"8cbf4e6308974f5ddfa3b07884de0fdcf52b1a4c1fb18fd7027685ff5fde6e29"} Apr 23 13:36:15.951168 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:15.950757 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ea6ce526-eeec-4478-8ddf-ff76d231efb6","Type":"ContainerStarted","Data":"05994265d986f79da3ebe1cf42dc70cb5919d77f0bc20a1c4670441642638718"} Apr 23 13:36:15.951168 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:15.950769 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ea6ce526-eeec-4478-8ddf-ff76d231efb6","Type":"ContainerStarted","Data":"fe451cabea27f6f85772039732bc9d59779f14a9575e20d7c80aee527c38ebd0"} Apr 23 13:36:15.951168 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:15.950781 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ea6ce526-eeec-4478-8ddf-ff76d231efb6","Type":"ContainerStarted","Data":"7cc004795d04377543d640e4881d70c8185eb6c7f2110d265cc43be9ef0b1325"} Apr 23 13:36:15.951168 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:15.950792 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ea6ce526-eeec-4478-8ddf-ff76d231efb6","Type":"ContainerStarted","Data":"597345b9f28e24ea119a80b777dfd21c424b94b44016410c659944dd1dcc528e"} Apr 23 13:36:15.977837 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:15.977782 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.977767032 podStartE2EDuration="2.977767032s" podCreationTimestamp="2026-04-23 13:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:36:15.976952375 +0000 UTC m=+243.359481508" watchObservedRunningTime="2026-04-23 13:36:15.977767032 +0000 UTC m=+243.360296161" Apr 23 13:36:19.312104 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:19.312070 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:36:25.033506 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:25.033464 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:36:25.035750 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:25.035716 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a4531d-2959-43b5-929f-9d7ddf10163b-metrics-certs\") pod \"network-metrics-daemon-c246c\" (UID: \"69a4531d-2959-43b5-929f-9d7ddf10163b\") " pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:36:25.110207 ip-10-0-141-176 kubenswrapper[2567]: I0423 
13:36:25.110173 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lj9pj\"" Apr 23 13:36:25.117786 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:25.117764 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c246c" Apr 23 13:36:25.234810 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:25.234684 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c246c"] Apr 23 13:36:25.237548 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:36:25.237519 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a4531d_2959_43b5_929f_9d7ddf10163b.slice/crio-8056c605c66fddf4895eb5a95a019df5b6acf4551507b5c03b6f71d765f8ad01 WatchSource:0}: Error finding container 8056c605c66fddf4895eb5a95a019df5b6acf4551507b5c03b6f71d765f8ad01: Status 404 returned error can't find the container with id 8056c605c66fddf4895eb5a95a019df5b6acf4551507b5c03b6f71d765f8ad01 Apr 23 13:36:25.980035 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:25.980000 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c246c" event={"ID":"69a4531d-2959-43b5-929f-9d7ddf10163b","Type":"ContainerStarted","Data":"8056c605c66fddf4895eb5a95a019df5b6acf4551507b5c03b6f71d765f8ad01"} Apr 23 13:36:26.984529 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:26.984497 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c246c" event={"ID":"69a4531d-2959-43b5-929f-9d7ddf10163b","Type":"ContainerStarted","Data":"df89c81bb8a57d4285ff252016e1512470c4e42c9a8ba16471986349f0e97d36"} Apr 23 13:36:26.984529 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:26.984536 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c246c" event={"ID":"69a4531d-2959-43b5-929f-9d7ddf10163b","Type":"ContainerStarted","Data":"b1c2b3c1fbd0af72fbbce014096949ea528012e315c4c8e57c3a079e0df3a93b"} Apr 23 13:36:27.001536 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:36:27.001476 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-c246c" podStartSLOduration=253.057238838 podStartE2EDuration="4m14.001457162s" podCreationTimestamp="2026-04-23 13:32:13 +0000 UTC" firstStartedPulling="2026-04-23 13:36:25.239400484 +0000 UTC m=+252.621929593" lastFinishedPulling="2026-04-23 13:36:26.18361881 +0000 UTC m=+253.566147917" observedRunningTime="2026-04-23 13:36:27.0002186 +0000 UTC m=+254.382747754" watchObservedRunningTime="2026-04-23 13:36:27.001457162 +0000 UTC m=+254.383986293" Apr 23 13:37:13.070973 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:37:13.070939 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:37:13.071563 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:37:13.071006 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:37:14.311613 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:37:14.311580 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:37:14.326781 ip-10-0-141-176 kubenswrapper[2567]: I0423 
13:37:14.326750 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:37:15.135426 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:37:15.135397 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:41:24.767937 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:24.767900 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-6fwwt"] Apr 23 13:41:24.771272 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:24.771254 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6fwwt" Apr 23 13:41:24.774308 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:24.774256 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 23 13:41:24.774308 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:24.774310 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 13:41:24.774512 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:24.774386 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-4xtp6\"" Apr 23 13:41:24.775699 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:24.775677 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 13:41:24.779924 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:24.779464 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6fwwt"] Apr 23 13:41:24.782587 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:24.782564 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqx4j\" (UniqueName: \"kubernetes.io/projected/5d7e9f42-0883-4a48-bbb7-c61087f3818d-kube-api-access-rqx4j\") pod \"odh-model-controller-696fc77849-6fwwt\" (UID: \"5d7e9f42-0883-4a48-bbb7-c61087f3818d\") " pod="kserve/odh-model-controller-696fc77849-6fwwt" Apr 23 13:41:24.782684 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:24.782601 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d7e9f42-0883-4a48-bbb7-c61087f3818d-cert\") pod \"odh-model-controller-696fc77849-6fwwt\" (UID: \"5d7e9f42-0883-4a48-bbb7-c61087f3818d\") " pod="kserve/odh-model-controller-696fc77849-6fwwt" Apr 23 13:41:24.883377 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:24.883340 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqx4j\" (UniqueName: \"kubernetes.io/projected/5d7e9f42-0883-4a48-bbb7-c61087f3818d-kube-api-access-rqx4j\") pod \"odh-model-controller-696fc77849-6fwwt\" (UID: \"5d7e9f42-0883-4a48-bbb7-c61087f3818d\") " pod="kserve/odh-model-controller-696fc77849-6fwwt" Apr 23 13:41:24.883571 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:24.883395 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d7e9f42-0883-4a48-bbb7-c61087f3818d-cert\") pod \"odh-model-controller-696fc77849-6fwwt\" (UID: \"5d7e9f42-0883-4a48-bbb7-c61087f3818d\") " pod="kserve/odh-model-controller-696fc77849-6fwwt" Apr 23 13:41:24.883571 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:41:24.883541 2567 secret.go:189] 
Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 23 13:41:24.883685 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:41:24.883609 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d7e9f42-0883-4a48-bbb7-c61087f3818d-cert podName:5d7e9f42-0883-4a48-bbb7-c61087f3818d nodeName:}" failed. No retries permitted until 2026-04-23 13:41:25.383591211 +0000 UTC m=+552.766120324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5d7e9f42-0883-4a48-bbb7-c61087f3818d-cert") pod "odh-model-controller-696fc77849-6fwwt" (UID: "5d7e9f42-0883-4a48-bbb7-c61087f3818d") : secret "odh-model-controller-webhook-cert" not found Apr 23 13:41:24.894059 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:24.894025 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqx4j\" (UniqueName: \"kubernetes.io/projected/5d7e9f42-0883-4a48-bbb7-c61087f3818d-kube-api-access-rqx4j\") pod \"odh-model-controller-696fc77849-6fwwt\" (UID: \"5d7e9f42-0883-4a48-bbb7-c61087f3818d\") " pod="kserve/odh-model-controller-696fc77849-6fwwt" Apr 23 13:41:25.387720 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:25.387684 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d7e9f42-0883-4a48-bbb7-c61087f3818d-cert\") pod \"odh-model-controller-696fc77849-6fwwt\" (UID: \"5d7e9f42-0883-4a48-bbb7-c61087f3818d\") " pod="kserve/odh-model-controller-696fc77849-6fwwt" Apr 23 13:41:25.390488 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:25.390463 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d7e9f42-0883-4a48-bbb7-c61087f3818d-cert\") pod \"odh-model-controller-696fc77849-6fwwt\" (UID: \"5d7e9f42-0883-4a48-bbb7-c61087f3818d\") " pod="kserve/odh-model-controller-696fc77849-6fwwt" Apr 23 13:41:25.683357 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:25.683270 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-6fwwt" Apr 23 13:41:25.798443 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:25.798414 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-6fwwt"] Apr 23 13:41:25.801305 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:41:25.801278 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d7e9f42_0883_4a48_bbb7_c61087f3818d.slice/crio-07c3df029e3b8ca20062cbc881506304a1f95476b0d30b1ce1d5b5de04ab9b1e WatchSource:0}: Error finding container 07c3df029e3b8ca20062cbc881506304a1f95476b0d30b1ce1d5b5de04ab9b1e: Status 404 returned error can't find the container with id 07c3df029e3b8ca20062cbc881506304a1f95476b0d30b1ce1d5b5de04ab9b1e Apr 23 13:41:25.802569 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:25.802552 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:41:25.812186 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:25.812158 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6fwwt" event={"ID":"5d7e9f42-0883-4a48-bbb7-c61087f3818d","Type":"ContainerStarted","Data":"07c3df029e3b8ca20062cbc881506304a1f95476b0d30b1ce1d5b5de04ab9b1e"} Apr 23 13:41:28.822568 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:28.822521 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-6fwwt" event={"ID":"5d7e9f42-0883-4a48-bbb7-c61087f3818d","Type":"ContainerStarted","Data":"5b19fbb974e543f310b1dc92628cdf32774b44830254d482fda9907e5109c3de"} Apr 23 13:41:28.823004 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:28.822651 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-6fwwt" Apr 23 13:41:28.839605 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:28.839549 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-6fwwt" podStartSLOduration=2.376439447 podStartE2EDuration="4.83953218s" podCreationTimestamp="2026-04-23 13:41:24 +0000 UTC" firstStartedPulling="2026-04-23 13:41:25.802668748 +0000 UTC m=+553.185197867" lastFinishedPulling="2026-04-23 13:41:28.265761477 +0000 UTC m=+555.648290600" observedRunningTime="2026-04-23 13:41:28.838752981 +0000 UTC m=+556.221282112" watchObservedRunningTime="2026-04-23 13:41:28.83953218 +0000 UTC m=+556.222061310" Apr 23 13:41:39.828393 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:39.828365 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-6fwwt" Apr 23 13:41:40.681763 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:40.681729 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-6tb7v"] Apr 23 13:41:40.685101 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:40.685085 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-6tb7v" Apr 23 13:41:40.688262 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:40.688241 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 13:41:40.688262 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:40.688259 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-brc9m\"" Apr 23 13:41:40.692756 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:40.692735 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-6tb7v"] Apr 23 13:41:40.697089 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:40.697072 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhb6g\" (UniqueName: \"kubernetes.io/projected/11ac586b-5bb2-482e-b559-f2f6186d7d8e-kube-api-access-dhb6g\") pod \"s3-init-6tb7v\" (UID: \"11ac586b-5bb2-482e-b559-f2f6186d7d8e\") " pod="kserve/s3-init-6tb7v" Apr 23 13:41:40.798352 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:40.798314 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhb6g\" (UniqueName: \"kubernetes.io/projected/11ac586b-5bb2-482e-b559-f2f6186d7d8e-kube-api-access-dhb6g\") pod \"s3-init-6tb7v\" (UID: \"11ac586b-5bb2-482e-b559-f2f6186d7d8e\") " pod="kserve/s3-init-6tb7v" Apr 23 13:41:40.807823 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:40.807797 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhb6g\" (UniqueName: \"kubernetes.io/projected/11ac586b-5bb2-482e-b559-f2f6186d7d8e-kube-api-access-dhb6g\") pod \"s3-init-6tb7v\" (UID: \"11ac586b-5bb2-482e-b559-f2f6186d7d8e\") " pod="kserve/s3-init-6tb7v" Apr 23 13:41:41.006739 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:41.006650 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-6tb7v" Apr 23 13:41:41.125702 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:41.125678 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-6tb7v"] Apr 23 13:41:41.128379 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:41:41.128339 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11ac586b_5bb2_482e_b559_f2f6186d7d8e.slice/crio-4a3b773a4ac3edf09717d2235682bc13ed82bd6e6ded2f8c9c773405bd98d303 WatchSource:0}: Error finding container 4a3b773a4ac3edf09717d2235682bc13ed82bd6e6ded2f8c9c773405bd98d303: Status 404 returned error can't find the container with id 4a3b773a4ac3edf09717d2235682bc13ed82bd6e6ded2f8c9c773405bd98d303 Apr 23 13:41:41.860118 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:41.860078 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6tb7v" event={"ID":"11ac586b-5bb2-482e-b559-f2f6186d7d8e","Type":"ContainerStarted","Data":"4a3b773a4ac3edf09717d2235682bc13ed82bd6e6ded2f8c9c773405bd98d303"} Apr 23 13:41:45.874254 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:45.874203 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6tb7v" event={"ID":"11ac586b-5bb2-482e-b559-f2f6186d7d8e","Type":"ContainerStarted","Data":"bc75ed52d0317f6159ac7f197688029aa9c1997e73824ca2d08c5ed71982043b"} Apr 23 13:41:45.892096 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:45.892038 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-6tb7v" podStartSLOduration=1.441047008 podStartE2EDuration="5.892023862s" podCreationTimestamp="2026-04-23 13:41:40 +0000 UTC" firstStartedPulling="2026-04-23 13:41:41.130545757 +0000 UTC m=+568.513074866" lastFinishedPulling="2026-04-23 13:41:45.581522607 +0000 UTC m=+572.964051720" observedRunningTime="2026-04-23 13:41:45.890494984 +0000 UTC m=+573.273024115" watchObservedRunningTime="2026-04-23 13:41:45.892023862 +0000 UTC m=+573.274552991" Apr 23 13:41:48.884621 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:48.884587 2567 generic.go:358] "Generic (PLEG): container finished" podID="11ac586b-5bb2-482e-b559-f2f6186d7d8e" containerID="bc75ed52d0317f6159ac7f197688029aa9c1997e73824ca2d08c5ed71982043b" exitCode=0 Apr 23 13:41:48.884990 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:48.884638 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6tb7v" event={"ID":"11ac586b-5bb2-482e-b559-f2f6186d7d8e","Type":"ContainerDied","Data":"bc75ed52d0317f6159ac7f197688029aa9c1997e73824ca2d08c5ed71982043b"} Apr 23 13:41:50.008933 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:50.008908 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-6tb7v" Apr 23 13:41:50.075892 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:50.075849 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhb6g\" (UniqueName: \"kubernetes.io/projected/11ac586b-5bb2-482e-b559-f2f6186d7d8e-kube-api-access-dhb6g\") pod \"11ac586b-5bb2-482e-b559-f2f6186d7d8e\" (UID: \"11ac586b-5bb2-482e-b559-f2f6186d7d8e\") " Apr 23 13:41:50.077994 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:50.077962 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ac586b-5bb2-482e-b559-f2f6186d7d8e-kube-api-access-dhb6g" (OuterVolumeSpecName: "kube-api-access-dhb6g") pod "11ac586b-5bb2-482e-b559-f2f6186d7d8e" (UID: "11ac586b-5bb2-482e-b559-f2f6186d7d8e"). InnerVolumeSpecName "kube-api-access-dhb6g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:41:50.177344 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:50.177306 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dhb6g\" (UniqueName: \"kubernetes.io/projected/11ac586b-5bb2-482e-b559-f2f6186d7d8e-kube-api-access-dhb6g\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:41:50.891982 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:50.891944 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6tb7v" event={"ID":"11ac586b-5bb2-482e-b559-f2f6186d7d8e","Type":"ContainerDied","Data":"4a3b773a4ac3edf09717d2235682bc13ed82bd6e6ded2f8c9c773405bd98d303"} Apr 23 13:41:50.891982 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:50.891965 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-6tb7v" Apr 23 13:41:50.891982 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:41:50.891983 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a3b773a4ac3edf09717d2235682bc13ed82bd6e6ded2f8c9c773405bd98d303" Apr 23 13:42:00.049582 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.049541 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9"] Apr 23 13:42:00.050041 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.049859 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11ac586b-5bb2-482e-b559-f2f6186d7d8e" containerName="s3-init" Apr 23 13:42:00.050041 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.049870 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ac586b-5bb2-482e-b559-f2f6186d7d8e" containerName="s3-init" Apr 23 13:42:00.050041 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.049940 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="11ac586b-5bb2-482e-b559-f2f6186d7d8e" containerName="s3-init" Apr 23 13:42:00.052050 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.052029 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:00.055001 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.054977 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-94d25-predictor-serving-cert\"" Apr 23 13:42:00.055122 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.055015 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:42:00.055122 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.055015 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-94d25-kube-rbac-proxy-sar-config\"" Apr 23 13:42:00.056384 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.056368 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tk4rw\"" Apr 23 13:42:00.056384 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.056379 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:42:00.062917 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.062885 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9"] Apr 23 13:42:00.157282 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.157244 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6xjr\" (UniqueName: \"kubernetes.io/projected/74a8dd96-1bab-4a96-836c-3451008c1453-kube-api-access-r6xjr\") pod \"success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9\" (UID: \"74a8dd96-1bab-4a96-836c-3451008c1453\") " pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:00.157434 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.157304 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74a8dd96-1bab-4a96-836c-3451008c1453-proxy-tls\") pod \"success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9\" (UID: \"74a8dd96-1bab-4a96-836c-3451008c1453\") " pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:00.157476 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.157438 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-94d25-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/74a8dd96-1bab-4a96-836c-3451008c1453-success-200-isvc-94d25-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9\" (UID: \"74a8dd96-1bab-4a96-836c-3451008c1453\") " pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:00.258599 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.258555 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-94d25-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/74a8dd96-1bab-4a96-836c-3451008c1453-success-200-isvc-94d25-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9\" (UID: \"74a8dd96-1bab-4a96-836c-3451008c1453\") " pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:00.258788 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.258674 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6xjr\" (UniqueName: \"kubernetes.io/projected/74a8dd96-1bab-4a96-836c-3451008c1453-kube-api-access-r6xjr\") pod \"success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9\" (UID: \"74a8dd96-1bab-4a96-836c-3451008c1453\") " pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:00.258855 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.258839 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74a8dd96-1bab-4a96-836c-3451008c1453-proxy-tls\") pod \"success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9\" (UID: \"74a8dd96-1bab-4a96-836c-3451008c1453\") " pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:00.259025 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:42:00.259005 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-94d25-predictor-serving-cert: secret "success-200-isvc-94d25-predictor-serving-cert" not found Apr 23 13:42:00.259257 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:42:00.259246 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74a8dd96-1bab-4a96-836c-3451008c1453-proxy-tls podName:74a8dd96-1bab-4a96-836c-3451008c1453 nodeName:}" failed. No retries permitted until 2026-04-23 13:42:00.759203555 +0000 UTC m=+588.141732667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/74a8dd96-1bab-4a96-836c-3451008c1453-proxy-tls") pod "success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" (UID: "74a8dd96-1bab-4a96-836c-3451008c1453") : secret "success-200-isvc-94d25-predictor-serving-cert" not found Apr 23 13:42:00.259468 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.259440 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-94d25-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/74a8dd96-1bab-4a96-836c-3451008c1453-success-200-isvc-94d25-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9\" (UID: \"74a8dd96-1bab-4a96-836c-3451008c1453\") " pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:00.268480 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.268447 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6xjr\" (UniqueName: \"kubernetes.io/projected/74a8dd96-1bab-4a96-836c-3451008c1453-kube-api-access-r6xjr\") pod \"success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9\" (UID: \"74a8dd96-1bab-4a96-836c-3451008c1453\") " pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:00.763182 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.763152 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74a8dd96-1bab-4a96-836c-3451008c1453-proxy-tls\") pod \"success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9\" (UID: \"74a8dd96-1bab-4a96-836c-3451008c1453\") " pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:00.765667 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.765634 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74a8dd96-1bab-4a96-836c-3451008c1453-proxy-tls\") pod \"success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9\" (UID: 
\"74a8dd96-1bab-4a96-836c-3451008c1453\") " pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:00.965440 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:00.965403 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:01.087107 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.087082 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9"] Apr 23 13:42:01.089195 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:42:01.089167 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74a8dd96_1bab_4a96_836c_3451008c1453.slice/crio-61d73ff86bbcb7daefc36a4dc3590b65cd92193dae7d0eeb911034eb1b55a4ac WatchSource:0}: Error finding container 61d73ff86bbcb7daefc36a4dc3590b65cd92193dae7d0eeb911034eb1b55a4ac: Status 404 returned error can't find the container with id 61d73ff86bbcb7daefc36a4dc3590b65cd92193dae7d0eeb911034eb1b55a4ac Apr 23 13:42:01.242036 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.242005 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p"] Apr 23 13:42:01.245214 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.245191 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.248012 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.247988 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\"" Apr 23 13:42:01.248105 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.247988 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\"" Apr 23 13:42:01.253822 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.253741 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p"] Apr 23 13:42:01.368835 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.368800 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6d11ee3-1e59-43fa-8a0e-20c272188933-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-565675b447-t5g7p\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.368991 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.368863 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnhkk\" (UniqueName: \"kubernetes.io/projected/a6d11ee3-1e59-43fa-8a0e-20c272188933-kube-api-access-fnhkk\") pod \"isvc-sklearn-graph-2-predictor-565675b447-t5g7p\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.368991 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.368915 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6d11ee3-1e59-43fa-8a0e-20c272188933-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") 
pod \"isvc-sklearn-graph-2-predictor-565675b447-t5g7p\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.369066 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.368987 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6d11ee3-1e59-43fa-8a0e-20c272188933-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-565675b447-t5g7p\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.469602 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.469563 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnhkk\" (UniqueName: \"kubernetes.io/projected/a6d11ee3-1e59-43fa-8a0e-20c272188933-kube-api-access-fnhkk\") pod \"isvc-sklearn-graph-2-predictor-565675b447-t5g7p\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.469774 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.469626 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6d11ee3-1e59-43fa-8a0e-20c272188933-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-565675b447-t5g7p\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.469774 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.469654 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6d11ee3-1e59-43fa-8a0e-20c272188933-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-565675b447-t5g7p\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.469774 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.469683 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6d11ee3-1e59-43fa-8a0e-20c272188933-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-565675b447-t5g7p\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.470097 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.470072 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6d11ee3-1e59-43fa-8a0e-20c272188933-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-565675b447-t5g7p\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.470511 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.470487 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6d11ee3-1e59-43fa-8a0e-20c272188933-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-565675b447-t5g7p\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.472058 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.472039 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6d11ee3-1e59-43fa-8a0e-20c272188933-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-565675b447-t5g7p\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.478565 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.478544 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnhkk\" (UniqueName: \"kubernetes.io/projected/a6d11ee3-1e59-43fa-8a0e-20c272188933-kube-api-access-fnhkk\") pod \"isvc-sklearn-graph-2-predictor-565675b447-t5g7p\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.557440 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.557380 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:01.717394 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.717330 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p"] Apr 23 13:42:01.723530 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:42:01.723489 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6d11ee3_1e59_43fa_8a0e_20c272188933.slice/crio-bab281ab14227189f22988ac4ebf3b1fce4a8a0d35be4e909ccb1fb3406e011b WatchSource:0}: Error finding container bab281ab14227189f22988ac4ebf3b1fce4a8a0d35be4e909ccb1fb3406e011b: Status 404 returned error can't find the container with id bab281ab14227189f22988ac4ebf3b1fce4a8a0d35be4e909ccb1fb3406e011b Apr 23 13:42:01.926336 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.926207 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" event={"ID":"74a8dd96-1bab-4a96-836c-3451008c1453","Type":"ContainerStarted","Data":"61d73ff86bbcb7daefc36a4dc3590b65cd92193dae7d0eeb911034eb1b55a4ac"} Apr 23 13:42:01.928042 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:01.927993 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" event={"ID":"a6d11ee3-1e59-43fa-8a0e-20c272188933","Type":"ContainerStarted","Data":"bab281ab14227189f22988ac4ebf3b1fce4a8a0d35be4e909ccb1fb3406e011b"} Apr 23 13:42:15.353347 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:15.353321 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:42:15.357412 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:15.357388 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:42:15.982360 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:15.982322 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" 
event={"ID":"74a8dd96-1bab-4a96-836c-3451008c1453","Type":"ContainerStarted","Data":"b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9"} Apr 23 13:42:15.984143 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:15.984089 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" event={"ID":"a6d11ee3-1e59-43fa-8a0e-20c272188933","Type":"ContainerStarted","Data":"24195febd38deee97094f5cc230ca38b011d652d19bbdb26243c594321117904"} Apr 23 13:42:17.991393 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:17.991345 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" event={"ID":"74a8dd96-1bab-4a96-836c-3451008c1453","Type":"ContainerStarted","Data":"4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908"} Apr 23 13:42:17.991860 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:17.991586 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:18.010307 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:18.010252 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" podStartSLOduration=1.283748942 podStartE2EDuration="18.010221828s" podCreationTimestamp="2026-04-23 13:42:00 +0000 UTC" firstStartedPulling="2026-04-23 13:42:01.091109469 +0000 UTC m=+588.473638577" lastFinishedPulling="2026-04-23 13:42:17.817582355 +0000 UTC m=+605.200111463" observedRunningTime="2026-04-23 13:42:18.007936683 +0000 UTC m=+605.390465813" watchObservedRunningTime="2026-04-23 13:42:18.010221828 +0000 UTC m=+605.392750957" Apr 23 13:42:18.994508 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:18.994465 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:18.995748 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:18.995721 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 13:42:19.998167 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:19.998127 2567 generic.go:358] "Generic (PLEG): container finished" podID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerID="24195febd38deee97094f5cc230ca38b011d652d19bbdb26243c594321117904" exitCode=0 Apr 23 13:42:19.998645 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:19.998208 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" event={"ID":"a6d11ee3-1e59-43fa-8a0e-20c272188933","Type":"ContainerDied","Data":"24195febd38deee97094f5cc230ca38b011d652d19bbdb26243c594321117904"} Apr 23 13:42:19.998723 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:19.998657 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 13:42:25.004835 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:25.004797 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:42:25.005458 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:25.005429 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 13:42:26.020885 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:26.020794 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" event={"ID":"a6d11ee3-1e59-43fa-8a0e-20c272188933","Type":"ContainerStarted","Data":"fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9"} Apr 23 13:42:26.020885 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:26.020842 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" event={"ID":"a6d11ee3-1e59-43fa-8a0e-20c272188933","Type":"ContainerStarted","Data":"a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d"} Apr 23 13:42:26.021305 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:26.021053 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:26.041301 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:26.041246 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" podStartSLOduration=1.024295075 podStartE2EDuration="25.041217466s" podCreationTimestamp="2026-04-23 13:42:01 +0000 UTC" firstStartedPulling="2026-04-23 13:42:01.725864688 +0000 UTC m=+589.108393803" lastFinishedPulling="2026-04-23 13:42:25.742787086 +0000 UTC m=+613.125316194" observedRunningTime="2026-04-23 13:42:26.0388216 +0000 UTC m=+613.421350731" watchObservedRunningTime="2026-04-23 13:42:26.041217466 +0000 UTC m=+613.423746595" Apr 23 13:42:27.024096 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:27.024060 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:27.025609 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:27.025577 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 13:42:28.026954 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:28.026913 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 13:42:33.031467 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:33.031434 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:42:33.032108 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:33.032080 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" 
podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 13:42:35.005692 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:35.005650 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 13:42:43.031997 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:43.031953 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 13:42:45.005687 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:45.005648 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 13:42:53.032816 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:53.032773 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 13:42:55.006044 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:42:55.005997 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 13:43:03.032429 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:03.032380 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 13:43:05.006294 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:05.006265 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:43:13.032481 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:13.032440 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 13:43:23.032826 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:23.032782 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 13:43:33.032390 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:33.032355 2567 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:43:34.473320 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.473286 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9"] Apr 23 13:43:34.473861 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.473710 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kserve-container" containerID="cri-o://b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9" gracePeriod=30 Apr 23 13:43:34.473861 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.473768 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kube-rbac-proxy" containerID="cri-o://4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908" gracePeriod=30 Apr 23 13:43:34.518894 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.518854 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2"] Apr 23 13:43:34.522051 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.522029 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:34.524940 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.524912 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d2d70-kube-rbac-proxy-sar-config\"" Apr 23 13:43:34.525172 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.525154 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d2d70-predictor-serving-cert\"" Apr 23 13:43:34.530110 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.530085 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2"] Apr 23 13:43:34.652846 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.652802 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rk7t\" (UniqueName: \"kubernetes.io/projected/33060c2f-cf52-4fb3-a6d2-b785274a7767-kube-api-access-9rk7t\") pod \"success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2\" (UID: \"33060c2f-cf52-4fb3-a6d2-b785274a7767\") " pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:34.653015 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.652866 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33060c2f-cf52-4fb3-a6d2-b785274a7767-proxy-tls\") pod \"success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2\" (UID: \"33060c2f-cf52-4fb3-a6d2-b785274a7767\") " pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:34.653015 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.652974 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-d2d70-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/33060c2f-cf52-4fb3-a6d2-b785274a7767-success-200-isvc-d2d70-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2\" (UID: \"33060c2f-cf52-4fb3-a6d2-b785274a7767\") " pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:34.753929 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.753837 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rk7t\" (UniqueName: \"kubernetes.io/projected/33060c2f-cf52-4fb3-a6d2-b785274a7767-kube-api-access-9rk7t\") pod \"success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2\" (UID: \"33060c2f-cf52-4fb3-a6d2-b785274a7767\") " pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:34.753929 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.753910 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33060c2f-cf52-4fb3-a6d2-b785274a7767-proxy-tls\") pod \"success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2\" (UID: \"33060c2f-cf52-4fb3-a6d2-b785274a7767\") " pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:34.754110 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.753958 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-d2d70-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33060c2f-cf52-4fb3-a6d2-b785274a7767-success-200-isvc-d2d70-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2\" (UID: \"33060c2f-cf52-4fb3-a6d2-b785274a7767\") " pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:34.754724 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.754697 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-d2d70-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33060c2f-cf52-4fb3-a6d2-b785274a7767-success-200-isvc-d2d70-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2\" (UID: \"33060c2f-cf52-4fb3-a6d2-b785274a7767\") " pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:34.756371 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.756351 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33060c2f-cf52-4fb3-a6d2-b785274a7767-proxy-tls\") pod \"success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2\" (UID: \"33060c2f-cf52-4fb3-a6d2-b785274a7767\") " pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:34.762461 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.762442 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rk7t\" (UniqueName: \"kubernetes.io/projected/33060c2f-cf52-4fb3-a6d2-b785274a7767-kube-api-access-9rk7t\") pod \"success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2\" (UID: \"33060c2f-cf52-4fb3-a6d2-b785274a7767\") " pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:34.834176 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.834134 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:34.970690 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.970664 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2"] Apr 23 13:43:34.972880 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:43:34.972852 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33060c2f_cf52_4fb3_a6d2_b785274a7767.slice/crio-2fce58b0958f23a9113fe5732a5fe9d81d203d9a74e9769b285237acceed8123 WatchSource:0}: Error finding container 2fce58b0958f23a9113fe5732a5fe9d81d203d9a74e9769b285237acceed8123: Status 404 returned error can't find the container with id 2fce58b0958f23a9113fe5732a5fe9d81d203d9a74e9769b285237acceed8123 Apr 23 13:43:34.999028 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:34.998998 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.23:8643/healthz\": dial tcp 10.133.0.23:8643: connect: connection refused" Apr 23 13:43:35.006066 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:35.006034 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 23 13:43:35.233175 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:35.233135 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" event={"ID":"33060c2f-cf52-4fb3-a6d2-b785274a7767","Type":"ContainerStarted","Data":"2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021"} Apr 23 13:43:35.233399 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:35.233182 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" event={"ID":"33060c2f-cf52-4fb3-a6d2-b785274a7767","Type":"ContainerStarted","Data":"203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265"} Apr 23 13:43:35.233399 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:35.233198 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" event={"ID":"33060c2f-cf52-4fb3-a6d2-b785274a7767","Type":"ContainerStarted","Data":"2fce58b0958f23a9113fe5732a5fe9d81d203d9a74e9769b285237acceed8123"} Apr 23 13:43:35.233399 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:35.233263 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:35.234822 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:35.234797 2567 generic.go:358] "Generic (PLEG): container finished" podID="74a8dd96-1bab-4a96-836c-3451008c1453" containerID="4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908" exitCode=2 Apr 23 13:43:35.234946 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:35.234850 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" 
event={"ID":"74a8dd96-1bab-4a96-836c-3451008c1453","Type":"ContainerDied","Data":"4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908"} Apr 23 13:43:35.251542 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:35.251478 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" podStartSLOduration=1.251464198 podStartE2EDuration="1.251464198s" podCreationTimestamp="2026-04-23 13:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:43:35.250011512 +0000 UTC m=+682.632540644" watchObservedRunningTime="2026-04-23 13:43:35.251464198 +0000 UTC m=+682.633993327" Apr 23 13:43:36.238084 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:36.238045 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:36.239251 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:36.239213 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 23 13:43:37.241185 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:37.241149 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 23 13:43:38.009118 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.009092 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:43:38.184186 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.184147 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6xjr\" (UniqueName: \"kubernetes.io/projected/74a8dd96-1bab-4a96-836c-3451008c1453-kube-api-access-r6xjr\") pod \"74a8dd96-1bab-4a96-836c-3451008c1453\" (UID: \"74a8dd96-1bab-4a96-836c-3451008c1453\") " Apr 23 13:43:38.184371 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.184287 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74a8dd96-1bab-4a96-836c-3451008c1453-proxy-tls\") pod \"74a8dd96-1bab-4a96-836c-3451008c1453\" (UID: \"74a8dd96-1bab-4a96-836c-3451008c1453\") " Apr 23 13:43:38.184371 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.184308 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-94d25-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/74a8dd96-1bab-4a96-836c-3451008c1453-success-200-isvc-94d25-kube-rbac-proxy-sar-config\") pod \"74a8dd96-1bab-4a96-836c-3451008c1453\" (UID: \"74a8dd96-1bab-4a96-836c-3451008c1453\") " Apr 23 13:43:38.184719 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.184692 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74a8dd96-1bab-4a96-836c-3451008c1453-success-200-isvc-94d25-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-94d25-kube-rbac-proxy-sar-config") pod "74a8dd96-1bab-4a96-836c-3451008c1453" (UID: "74a8dd96-1bab-4a96-836c-3451008c1453"). InnerVolumeSpecName "success-200-isvc-94d25-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:43:38.186362 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.186334 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a8dd96-1bab-4a96-836c-3451008c1453-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "74a8dd96-1bab-4a96-836c-3451008c1453" (UID: "74a8dd96-1bab-4a96-836c-3451008c1453"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:43:38.186461 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.186393 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a8dd96-1bab-4a96-836c-3451008c1453-kube-api-access-r6xjr" (OuterVolumeSpecName: "kube-api-access-r6xjr") pod "74a8dd96-1bab-4a96-836c-3451008c1453" (UID: "74a8dd96-1bab-4a96-836c-3451008c1453"). InnerVolumeSpecName "kube-api-access-r6xjr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:43:38.246512 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.246477 2567 generic.go:358] "Generic (PLEG): container finished" podID="74a8dd96-1bab-4a96-836c-3451008c1453" containerID="b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9" exitCode=0 Apr 23 13:43:38.246890 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.246558 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" event={"ID":"74a8dd96-1bab-4a96-836c-3451008c1453","Type":"ContainerDied","Data":"b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9"} Apr 23 13:43:38.246890 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.246599 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" event={"ID":"74a8dd96-1bab-4a96-836c-3451008c1453","Type":"ContainerDied","Data":"61d73ff86bbcb7daefc36a4dc3590b65cd92193dae7d0eeb911034eb1b55a4ac"} Apr 23 13:43:38.246890 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.246614 2567 scope.go:117] "RemoveContainer" containerID="4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908" Apr 23 13:43:38.246890 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.246570 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9" Apr 23 13:43:38.254717 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.254696 2567 scope.go:117] "RemoveContainer" containerID="b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9" Apr 23 13:43:38.262040 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.262025 2567 scope.go:117] "RemoveContainer" containerID="4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908" Apr 23 13:43:38.262307 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:43:38.262282 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908\": container with ID starting with 4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908 not found: ID does not exist" containerID="4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908" Apr 23 13:43:38.262400 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.262320 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908"} err="failed to get container status \"4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908\": rpc error: code = NotFound desc = could not find container \"4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908\": container with ID starting with 4b024f579364deba8b04451057f778e18302541a199c2cdc50a4ca4d08343908 not found: ID does not exist" Apr 23 13:43:38.262400 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.262345 2567 scope.go:117] "RemoveContainer" containerID="b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9" Apr 23 13:43:38.262584 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:43:38.262567 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9\": container with ID starting with b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9 
not found: ID does not exist" containerID="b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9" Apr 23 13:43:38.262623 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.262591 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9"} err="failed to get container status \"b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9\": rpc error: code = NotFound desc = could not find container \"b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9\": container with ID starting with b79bfc846efa598f76c5a291cefd3d28d2ebbf84a7371e4ba383713c05efe5c9 not found: ID does not exist" Apr 23 13:43:38.269816 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.269792 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9"] Apr 23 13:43:38.273858 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.273832 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-94d25-predictor-6f676d4cb5-jjsc9"] Apr 23 13:43:38.285552 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.285532 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r6xjr\" (UniqueName: \"kubernetes.io/projected/74a8dd96-1bab-4a96-836c-3451008c1453-kube-api-access-r6xjr\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:43:38.285653 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.285555 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74a8dd96-1bab-4a96-836c-3451008c1453-proxy-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:43:38.285653 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:38.285568 2567 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-94d25-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/74a8dd96-1bab-4a96-836c-3451008c1453-success-200-isvc-94d25-kube-rbac-proxy-sar-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:43:39.210700 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:39.210671 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" path="/var/lib/kubelet/pods/74a8dd96-1bab-4a96-836c-3451008c1453/volumes" Apr 23 13:43:42.245415 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:42.245387 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:43:42.245962 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:42.245934 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 23 13:43:52.246748 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:43:52.246704 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 23 13:44:02.246073 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:02.246030 2567 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 23 13:44:10.314389 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.314351 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p"] Apr 23 13:44:10.314820 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.314790 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" containerID="cri-o://a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d" gracePeriod=30 Apr 23 13:44:10.314898 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.314808 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kube-rbac-proxy" containerID="cri-o://fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9" gracePeriod=30 Apr 23 13:44:10.390278 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.390241 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt"] Apr 23 13:44:10.390635 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.390621 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kserve-container" Apr 23 13:44:10.390694 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.390638 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kserve-container" Apr 23 13:44:10.390694 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.390658 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kube-rbac-proxy" Apr 23 13:44:10.390694 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.390664 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kube-rbac-proxy" Apr 23 13:44:10.390852 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.390713 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kube-rbac-proxy" Apr 23 13:44:10.390852 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.390724 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="74a8dd96-1bab-4a96-836c-3451008c1453" containerName="kserve-container" Apr 23 13:44:10.393180 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.393156 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:10.396382 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.396356 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f7bdd-predictor-serving-cert\"" Apr 23 13:44:10.396504 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.396361 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f7bdd-kube-rbac-proxy-sar-config\"" Apr 23 13:44:10.402846 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.402817 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt"] Apr 23 13:44:10.437934 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.437746 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4vwm\" (UniqueName: \"kubernetes.io/projected/feee8d88-99ee-4507-8190-0cb66713e21d-kube-api-access-t4vwm\") pod \"success-200-isvc-f7bdd-predictor-54969f468c-dq4nt\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") " pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:10.437934 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.437813 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/feee8d88-99ee-4507-8190-0cb66713e21d-proxy-tls\") pod \"success-200-isvc-f7bdd-predictor-54969f468c-dq4nt\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") " pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:10.437934 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.437846 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-f7bdd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/feee8d88-99ee-4507-8190-0cb66713e21d-success-200-isvc-f7bdd-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f7bdd-predictor-54969f468c-dq4nt\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") " pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:10.538290 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.538255 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4vwm\" (UniqueName: \"kubernetes.io/projected/feee8d88-99ee-4507-8190-0cb66713e21d-kube-api-access-t4vwm\") pod \"success-200-isvc-f7bdd-predictor-54969f468c-dq4nt\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") " pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:10.538459 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.538298 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/feee8d88-99ee-4507-8190-0cb66713e21d-proxy-tls\") pod \"success-200-isvc-f7bdd-predictor-54969f468c-dq4nt\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") " pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:10.538459 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.538326 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-f7bdd-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/feee8d88-99ee-4507-8190-0cb66713e21d-success-200-isvc-f7bdd-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f7bdd-predictor-54969f468c-dq4nt\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") " pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:10.538459 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:44:10.538425 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-serving-cert: secret "success-200-isvc-f7bdd-predictor-serving-cert" not found Apr 23 13:44:10.538622 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:44:10.538504 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/feee8d88-99ee-4507-8190-0cb66713e21d-proxy-tls podName:feee8d88-99ee-4507-8190-0cb66713e21d nodeName:}" failed. No retries permitted until 2026-04-23 13:44:11.038481559 +0000 UTC m=+718.421010680 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/feee8d88-99ee-4507-8190-0cb66713e21d-proxy-tls") pod "success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" (UID: "feee8d88-99ee-4507-8190-0cb66713e21d") : secret "success-200-isvc-f7bdd-predictor-serving-cert" not found Apr 23 13:44:10.538968 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.538949 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-f7bdd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/feee8d88-99ee-4507-8190-0cb66713e21d-success-200-isvc-f7bdd-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f7bdd-predictor-54969f468c-dq4nt\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") " pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:10.546828 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:10.546798 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4vwm\" (UniqueName: \"kubernetes.io/projected/feee8d88-99ee-4507-8190-0cb66713e21d-kube-api-access-t4vwm\") pod \"success-200-isvc-f7bdd-predictor-54969f468c-dq4nt\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") " pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:11.042789 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:11.042747 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/feee8d88-99ee-4507-8190-0cb66713e21d-proxy-tls\") pod \"success-200-isvc-f7bdd-predictor-54969f468c-dq4nt\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") " pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:11.046163 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:11.046127 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/feee8d88-99ee-4507-8190-0cb66713e21d-proxy-tls\") pod \"success-200-isvc-f7bdd-predictor-54969f468c-dq4nt\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") " pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:11.306595 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:11.306494 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:11.347243 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:11.347199 2567 generic.go:358] "Generic (PLEG): container finished" podID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerID="fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9" exitCode=2 Apr 23 13:44:11.347585 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:11.347256 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" event={"ID":"a6d11ee3-1e59-43fa-8a0e-20c272188933","Type":"ContainerDied","Data":"fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9"} Apr 23 13:44:11.429123 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:11.428806 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt"] Apr 23 13:44:11.432599 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:44:11.432553 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeee8d88_99ee_4507_8190_0cb66713e21d.slice/crio-2a01a73b75047a39636f4db28e92c992ec9292a2bcc67994ae4369b65542ff4f WatchSource:0}: Error finding container 2a01a73b75047a39636f4db28e92c992ec9292a2bcc67994ae4369b65542ff4f: Status 404 returned error can't find the container with id 2a01a73b75047a39636f4db28e92c992ec9292a2bcc67994ae4369b65542ff4f Apr 23 13:44:12.246485 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:12.246447 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 23 13:44:12.351577 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:12.351541 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" event={"ID":"feee8d88-99ee-4507-8190-0cb66713e21d","Type":"ContainerStarted","Data":"3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a"} Apr 23 13:44:12.351577 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:12.351580 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" event={"ID":"feee8d88-99ee-4507-8190-0cb66713e21d","Type":"ContainerStarted","Data":"73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8"} Apr 23 13:44:12.351988 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:12.351596 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" event={"ID":"feee8d88-99ee-4507-8190-0cb66713e21d","Type":"ContainerStarted","Data":"2a01a73b75047a39636f4db28e92c992ec9292a2bcc67994ae4369b65542ff4f"} Apr 23 13:44:12.351988 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:12.351668 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:12.370288 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:12.370220 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" podStartSLOduration=2.37020403 podStartE2EDuration="2.37020403s" podCreationTimestamp="2026-04-23 13:44:10 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:44:12.3679599 +0000 UTC m=+719.750489029" watchObservedRunningTime="2026-04-23 13:44:12.37020403 +0000 UTC m=+719.752733160" Apr 23 13:44:13.027573 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:13.027518 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.24:8643/healthz\": dial tcp 10.133.0.24:8643: connect: connection refused" Apr 23 13:44:13.032992 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:13.032943 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 13:44:13.355129 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:13.355088 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:13.356580 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:13.356555 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:44:14.357799 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:14.357762 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:44:15.064971 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.064947 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:44:15.178046 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.178013 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6d11ee3-1e59-43fa-8a0e-20c272188933-proxy-tls\") pod \"a6d11ee3-1e59-43fa-8a0e-20c272188933\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " Apr 23 13:44:15.178046 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.178049 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6d11ee3-1e59-43fa-8a0e-20c272188933-kserve-provision-location\") pod \"a6d11ee3-1e59-43fa-8a0e-20c272188933\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " Apr 23 13:44:15.178302 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.178082 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6d11ee3-1e59-43fa-8a0e-20c272188933-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"a6d11ee3-1e59-43fa-8a0e-20c272188933\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " Apr 23 13:44:15.178302 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.178127 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnhkk\" (UniqueName: \"kubernetes.io/projected/a6d11ee3-1e59-43fa-8a0e-20c272188933-kube-api-access-fnhkk\") pod \"a6d11ee3-1e59-43fa-8a0e-20c272188933\" (UID: \"a6d11ee3-1e59-43fa-8a0e-20c272188933\") " Apr 23 13:44:15.178428 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.178388 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6d11ee3-1e59-43fa-8a0e-20c272188933-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a6d11ee3-1e59-43fa-8a0e-20c272188933" (UID: "a6d11ee3-1e59-43fa-8a0e-20c272188933"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:44:15.178542 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.178517 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d11ee3-1e59-43fa-8a0e-20c272188933-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "a6d11ee3-1e59-43fa-8a0e-20c272188933" (UID: "a6d11ee3-1e59-43fa-8a0e-20c272188933"). InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:44:15.180052 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.180029 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d11ee3-1e59-43fa-8a0e-20c272188933-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a6d11ee3-1e59-43fa-8a0e-20c272188933" (UID: "a6d11ee3-1e59-43fa-8a0e-20c272188933"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:44:15.180117 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.180094 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d11ee3-1e59-43fa-8a0e-20c272188933-kube-api-access-fnhkk" (OuterVolumeSpecName: "kube-api-access-fnhkk") pod "a6d11ee3-1e59-43fa-8a0e-20c272188933" (UID: "a6d11ee3-1e59-43fa-8a0e-20c272188933"). 
InnerVolumeSpecName "kube-api-access-fnhkk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:44:15.279003 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.278964 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a6d11ee3-1e59-43fa-8a0e-20c272188933-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:44:15.279003 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.278997 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fnhkk\" (UniqueName: \"kubernetes.io/projected/a6d11ee3-1e59-43fa-8a0e-20c272188933-kube-api-access-fnhkk\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:44:15.279003 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.279007 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a6d11ee3-1e59-43fa-8a0e-20c272188933-proxy-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:44:15.279261 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.279018 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a6d11ee3-1e59-43fa-8a0e-20c272188933-kserve-provision-location\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:44:15.362492 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.362453 2567 generic.go:358] "Generic (PLEG): container finished" podID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerID="a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d" exitCode=0 Apr 23 13:44:15.362869 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.362544 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" Apr 23 13:44:15.362869 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.362542 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" event={"ID":"a6d11ee3-1e59-43fa-8a0e-20c272188933","Type":"ContainerDied","Data":"a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d"} Apr 23 13:44:15.362869 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.362645 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p" event={"ID":"a6d11ee3-1e59-43fa-8a0e-20c272188933","Type":"ContainerDied","Data":"bab281ab14227189f22988ac4ebf3b1fce4a8a0d35be4e909ccb1fb3406e011b"} Apr 23 13:44:15.362869 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.362660 2567 scope.go:117] "RemoveContainer" containerID="fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9" Apr 23 13:44:15.370728 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.370710 2567 scope.go:117] "RemoveContainer" containerID="a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d" Apr 23 13:44:15.377720 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.377703 2567 scope.go:117] "RemoveContainer" containerID="24195febd38deee97094f5cc230ca38b011d652d19bbdb26243c594321117904" Apr 23 13:44:15.380598 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.380576 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p"] Apr 23 13:44:15.384498 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.384476 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-565675b447-t5g7p"] Apr 23 13:44:15.385448 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.385436 2567 scope.go:117] "RemoveContainer" containerID="fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9" Apr 23 13:44:15.385710 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:44:15.385691 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9\": container with ID starting with fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9 not found: ID does not exist" containerID="fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9" Apr 23 13:44:15.385778 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.385719 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9"} err="failed to get container status \"fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9\": rpc error: code = NotFound desc = could not find container \"fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9\": container with ID starting with fb4d441f6f92c0c42cf89d4c6b6aba2dfaaa27bd72b360e589892d9f6ca04fa9 not found: ID does not exist" Apr 23 13:44:15.385778 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.385736 2567 scope.go:117] "RemoveContainer" containerID="a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d" Apr 23 13:44:15.385936 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:44:15.385922 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d\": container with ID starting with a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d not found: ID does not exist" containerID="a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d" Apr 23 13:44:15.385974 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.385940 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d"} err="failed to get container status \"a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d\": rpc error: code = NotFound desc = could not find container \"a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d\": container with ID starting with a28646b82f08feb35b741ec87683d25ffc7d50293f0ad900857cc90fe3e35c4d not found: ID does not exist" Apr 23 13:44:15.385974 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.385952 2567 scope.go:117] "RemoveContainer" containerID="24195febd38deee97094f5cc230ca38b011d652d19bbdb26243c594321117904" Apr 23 13:44:15.386155 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:44:15.386138 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24195febd38deee97094f5cc230ca38b011d652d19bbdb26243c594321117904\": container with ID starting with 24195febd38deee97094f5cc230ca38b011d652d19bbdb26243c594321117904 not found: ID does not exist" containerID="24195febd38deee97094f5cc230ca38b011d652d19bbdb26243c594321117904" Apr 23 13:44:15.386193 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:15.386162 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24195febd38deee97094f5cc230ca38b011d652d19bbdb26243c594321117904"} err="failed to get container status \"24195febd38deee97094f5cc230ca38b011d652d19bbdb26243c594321117904\": rpc error: code = NotFound desc = could not find container \"24195febd38deee97094f5cc230ca38b011d652d19bbdb26243c594321117904\": container with ID starting with 24195febd38deee97094f5cc230ca38b011d652d19bbdb26243c594321117904 not found: ID does not exist" Apr 23 13:44:17.210287 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:17.210252 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" path="/var/lib/kubelet/pods/a6d11ee3-1e59-43fa-8a0e-20c272188933/volumes" Apr 23 13:44:19.362039 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:19.362007 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:44:19.362537 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:19.362509 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:44:22.247120 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:22.247081 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:44:29.362875 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:29.362833 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:44:39.362802 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:39.362763 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:44:49.362635 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:49.362597 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 13:44:59.363414 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:44:59.363384 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" Apr 23 13:47:15.380552 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:47:15.380520 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:47:15.382093 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:47:15.381997 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:52:15.401871 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:15.401756 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:52:15.404038 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:15.403897 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:52:49.408745 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.408710 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2"] Apr 23 13:52:49.409347 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.409005 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kserve-container" containerID="cri-o://203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265" gracePeriod=30 Apr 23 13:52:49.409347 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.409031 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kube-rbac-proxy" containerID="cri-o://2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021" gracePeriod=30 Apr 23 13:52:49.494127 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.494089 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g"] Apr 23 13:52:49.494444 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.494431 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" Apr 23 13:52:49.494501 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.494446 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" Apr 23 13:52:49.494501 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.494469 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="storage-initializer" Apr 23 13:52:49.494501 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.494475 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="storage-initializer" Apr 23 13:52:49.494501 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.494481 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kube-rbac-proxy" Apr 23 13:52:49.494501 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.494487 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kube-rbac-proxy" Apr 23 13:52:49.494702 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.494558 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kserve-container" Apr 23 13:52:49.494702 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.494568 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6d11ee3-1e59-43fa-8a0e-20c272188933" containerName="kube-rbac-proxy" Apr 23 13:52:49.497697 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.497675 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:49.500714 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.500691 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-2d7b2-predictor-serving-cert\"" Apr 23 13:52:49.500824 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.500767 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-2d7b2-kube-rbac-proxy-sar-config\"" Apr 23 13:52:49.517768 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.517728 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g"] Apr 23 13:52:49.529443 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.529410 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/392cb698-8a42-421c-bc37-5884e4f47df1-proxy-tls\") pod \"success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") " pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:49.529604 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.529469 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzwtx\" (UniqueName: \"kubernetes.io/projected/392cb698-8a42-421c-bc37-5884e4f47df1-kube-api-access-rzwtx\") pod \"success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") " pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:49.529604 
ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.529506 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-2d7b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/392cb698-8a42-421c-bc37-5884e4f47df1-success-200-isvc-2d7b2-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") " pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:49.631033 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.630989 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/392cb698-8a42-421c-bc37-5884e4f47df1-proxy-tls\") pod \"success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") " pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:49.631282 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.631045 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzwtx\" (UniqueName: \"kubernetes.io/projected/392cb698-8a42-421c-bc37-5884e4f47df1-kube-api-access-rzwtx\") pod \"success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") " pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:49.631282 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.631080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-2d7b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/392cb698-8a42-421c-bc37-5884e4f47df1-success-200-isvc-2d7b2-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") " pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:49.631282 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:52:49.631161 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-serving-cert: secret "success-200-isvc-2d7b2-predictor-serving-cert" not found Apr 23 13:52:49.631282 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:52:49.631255 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/392cb698-8a42-421c-bc37-5884e4f47df1-proxy-tls podName:392cb698-8a42-421c-bc37-5884e4f47df1 nodeName:}" failed. No retries permitted until 2026-04-23 13:52:50.13121856 +0000 UTC m=+1237.513747692 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/392cb698-8a42-421c-bc37-5884e4f47df1-proxy-tls") pod "success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" (UID: "392cb698-8a42-421c-bc37-5884e4f47df1") : secret "success-200-isvc-2d7b2-predictor-serving-cert" not found Apr 23 13:52:49.631793 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.631769 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-2d7b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/392cb698-8a42-421c-bc37-5884e4f47df1-success-200-isvc-2d7b2-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") " pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:49.639905 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.639882 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzwtx\" (UniqueName: \"kubernetes.io/projected/392cb698-8a42-421c-bc37-5884e4f47df1-kube-api-access-rzwtx\") pod \"success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") " pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:49.881331 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.881294 2567 generic.go:358] "Generic (PLEG): container finished" podID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerID="2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021" exitCode=2 Apr 23 13:52:49.881503 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:49.881364 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" event={"ID":"33060c2f-cf52-4fb3-a6d2-b785274a7767","Type":"ContainerDied","Data":"2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021"} Apr 23 13:52:50.135540 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:50.135458 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/392cb698-8a42-421c-bc37-5884e4f47df1-proxy-tls\") pod \"success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") " pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:50.137878 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:50.137852 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/392cb698-8a42-421c-bc37-5884e4f47df1-proxy-tls\") pod \"success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") " pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:50.407713 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:50.407613 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:50.530059 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:50.529957 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g"] Apr 23 13:52:50.533484 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:52:50.533448 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod392cb698_8a42_421c_bc37_5884e4f47df1.slice/crio-1bbf4f5688d46556c3d55edf81f83f87c22d8feb2a555bd505bf9aafb737de70 WatchSource:0}: Error finding container 1bbf4f5688d46556c3d55edf81f83f87c22d8feb2a555bd505bf9aafb737de70: Status 404 returned error can't find the container with id 1bbf4f5688d46556c3d55edf81f83f87c22d8feb2a555bd505bf9aafb737de70 Apr 23 13:52:50.535390 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:50.535368 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:52:50.885846 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:50.885808 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" event={"ID":"392cb698-8a42-421c-bc37-5884e4f47df1","Type":"ContainerStarted","Data":"c0560c69d0e18ae51ad7f27faa752d8286ba3202d833460646c3aeaf54794c88"} Apr 23 13:52:50.885846 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:50.885848 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" event={"ID":"392cb698-8a42-421c-bc37-5884e4f47df1","Type":"ContainerStarted","Data":"2a600223262e09935b0a5a0d82f27287422b9c0160c900237536a8c6ff0b5ae5"} Apr 23 13:52:50.886100 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:50.885859 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" event={"ID":"392cb698-8a42-421c-bc37-5884e4f47df1","Type":"ContainerStarted","Data":"1bbf4f5688d46556c3d55edf81f83f87c22d8feb2a555bd505bf9aafb737de70"} Apr 23 13:52:50.886100 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:50.885917 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:50.905724 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:50.905681 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" podStartSLOduration=1.90566535 podStartE2EDuration="1.90566535s" podCreationTimestamp="2026-04-23 13:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:52:50.903410602 +0000 UTC m=+1238.285939733" watchObservedRunningTime="2026-04-23 13:52:50.90566535 +0000 UTC m=+1238.288194536" Apr 23 13:52:51.888635 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:51.888605 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" Apr 23 13:52:51.889960 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:51.889933 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 13:52:52.242505 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.242412 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.25:8643/healthz\": dial tcp 10.133.0.25:8643: connect: connection refused" Apr 23 13:52:52.246759 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.246735 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 23 13:52:52.756791 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.756764 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:52:52.860033 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.860003 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33060c2f-cf52-4fb3-a6d2-b785274a7767-proxy-tls\") pod \"33060c2f-cf52-4fb3-a6d2-b785274a7767\" (UID: \"33060c2f-cf52-4fb3-a6d2-b785274a7767\") " Apr 23 13:52:52.860186 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.860056 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-d2d70-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33060c2f-cf52-4fb3-a6d2-b785274a7767-success-200-isvc-d2d70-kube-rbac-proxy-sar-config\") pod \"33060c2f-cf52-4fb3-a6d2-b785274a7767\" (UID: \"33060c2f-cf52-4fb3-a6d2-b785274a7767\") " Apr 23 13:52:52.860186 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.860130 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rk7t\" (UniqueName: \"kubernetes.io/projected/33060c2f-cf52-4fb3-a6d2-b785274a7767-kube-api-access-9rk7t\") pod \"33060c2f-cf52-4fb3-a6d2-b785274a7767\" (UID: \"33060c2f-cf52-4fb3-a6d2-b785274a7767\") " Apr 23 13:52:52.860470 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.860439 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33060c2f-cf52-4fb3-a6d2-b785274a7767-success-200-isvc-d2d70-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-d2d70-kube-rbac-proxy-sar-config") pod "33060c2f-cf52-4fb3-a6d2-b785274a7767" (UID: "33060c2f-cf52-4fb3-a6d2-b785274a7767"). InnerVolumeSpecName "success-200-isvc-d2d70-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:52:52.862155 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.862129 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33060c2f-cf52-4fb3-a6d2-b785274a7767-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "33060c2f-cf52-4fb3-a6d2-b785274a7767" (UID: "33060c2f-cf52-4fb3-a6d2-b785274a7767"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:52:52.862215 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.862167 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33060c2f-cf52-4fb3-a6d2-b785274a7767-kube-api-access-9rk7t" (OuterVolumeSpecName: "kube-api-access-9rk7t") pod "33060c2f-cf52-4fb3-a6d2-b785274a7767" (UID: "33060c2f-cf52-4fb3-a6d2-b785274a7767"). InnerVolumeSpecName "kube-api-access-9rk7t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:52:52.892890 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.892857 2567 generic.go:358] "Generic (PLEG): container finished" podID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerID="203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265" exitCode=0 Apr 23 13:52:52.893291 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.892933 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" Apr 23 13:52:52.893291 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.892946 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" event={"ID":"33060c2f-cf52-4fb3-a6d2-b785274a7767","Type":"ContainerDied","Data":"203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265"} Apr 23 13:52:52.893291 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.892983 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2" event={"ID":"33060c2f-cf52-4fb3-a6d2-b785274a7767","Type":"ContainerDied","Data":"2fce58b0958f23a9113fe5732a5fe9d81d203d9a74e9769b285237acceed8123"} Apr 23 13:52:52.893291 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.893005 2567 scope.go:117] "RemoveContainer" containerID="2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021" Apr 23 13:52:52.893651 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.893606 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 13:52:52.901902 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.901885 2567 scope.go:117] "RemoveContainer" containerID="203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265" Apr 23 13:52:52.909018 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.908997 2567 scope.go:117] "RemoveContainer" containerID="2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021" Apr 23 13:52:52.909278 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:52:52.909255 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021\": container with ID starting with 2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021 not found: ID does not exist" containerID="2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021" Apr 23 13:52:52.909318 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.909291 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021"} err="failed to get container status 
\"2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021\": rpc error: code = NotFound desc = could not find container \"2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021\": container with ID starting with 2762b776c822ea41238996dc48e603ed792c1d84ff86027d32ad702076808021 not found: ID does not exist" Apr 23 13:52:52.909362 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.909318 2567 scope.go:117] "RemoveContainer" containerID="203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265" Apr 23 13:52:52.909538 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:52:52.909521 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265\": container with ID starting with 203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265 not found: ID does not exist" containerID="203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265" Apr 23 13:52:52.909577 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.909545 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265"} err="failed to get container status \"203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265\": rpc error: code = NotFound desc = could not find container \"203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265\": container with ID starting with 203fdc60f073f8e66e84505ee07c7e431903c406b01f7d98f86c93de26502265 not found: ID does not exist" Apr 23 13:52:52.914880 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.914855 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2"] Apr 23 13:52:52.916839 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.916815 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d2d70-predictor-5cd95cc7dc-9qgt2"] Apr 23 13:52:52.961100 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.961065 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9rk7t\" (UniqueName: \"kubernetes.io/projected/33060c2f-cf52-4fb3-a6d2-b785274a7767-kube-api-access-9rk7t\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:52:52.961100 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.961092 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33060c2f-cf52-4fb3-a6d2-b785274a7767-proxy-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:52:52.961100 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:52.961104 2567 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-d2d70-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33060c2f-cf52-4fb3-a6d2-b785274a7767-success-200-isvc-d2d70-kube-rbac-proxy-sar-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:52:53.210402 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:53.210318 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" path="/var/lib/kubelet/pods/33060c2f-cf52-4fb3-a6d2-b785274a7767/volumes" Apr 23 13:52:57.897835 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:57.897806 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" 
Apr 23 13:52:57.898415 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:52:57.898364 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 13:53:07.898440 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:07.898394 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 13:53:17.899138 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:17.899079 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 13:53:25.237934 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.237897 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt"]
Apr 23 13:53:25.238368 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.238167 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kserve-container" containerID="cri-o://73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8" gracePeriod=30
Apr 23 13:53:25.238368 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.238218 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kube-rbac-proxy" containerID="cri-o://3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a" gracePeriod=30
Apr 23 13:53:25.269591 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.269562 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"]
Apr 23 13:53:25.270047 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.270033 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kube-rbac-proxy"
Apr 23 13:53:25.270096 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.270051 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kube-rbac-proxy"
Apr 23 13:53:25.270096 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.270081 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kserve-container"
Apr 23 13:53:25.270096 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.270090 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kserve-container"
Apr 23 13:53:25.270201 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.270157 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kube-rbac-proxy"
Apr 23 13:53:25.270201 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.270169 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="33060c2f-cf52-4fb3-a6d2-b785274a7767" containerName="kserve-container"
Apr 23 13:53:25.274680 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.274658 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:25.278371 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.278145 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-1cfcc-predictor-serving-cert\""
Apr 23 13:53:25.278371 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.278186 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-1cfcc-kube-rbac-proxy-sar-config\""
Apr 23 13:53:25.284099 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.284059 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"]
Apr 23 13:53:25.437160 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.437116 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-proxy-tls\") pod \"success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:25.437380 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.437237 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-1cfcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-success-200-isvc-1cfcc-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:25.437380 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.437284 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxcqb\" (UniqueName: \"kubernetes.io/projected/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-kube-api-access-bxcqb\") pod \"success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:25.538689 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.538600 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxcqb\" (UniqueName: \"kubernetes.io/projected/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-kube-api-access-bxcqb\") pod \"success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:25.538689 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.538658 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-proxy-tls\") pod \"success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:25.538931 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.538700 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-1cfcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-success-200-isvc-1cfcc-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:25.538931 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:53:25.538806 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-serving-cert: secret "success-200-isvc-1cfcc-predictor-serving-cert" not found
Apr 23 13:53:25.538931 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:53:25.538883 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-proxy-tls podName:f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00 nodeName:}" failed. No retries permitted until 2026-04-23 13:53:26.038865696 +0000 UTC m=+1273.421394805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-proxy-tls") pod "success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" (UID: "f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00") : secret "success-200-isvc-1cfcc-predictor-serving-cert" not found
Apr 23 13:53:25.539363 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.539346 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-1cfcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-success-200-isvc-1cfcc-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:25.549517 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.549491 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxcqb\" (UniqueName: \"kubernetes.io/projected/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-kube-api-access-bxcqb\") pod \"success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:25.993835 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.993801 2567 generic.go:358] "Generic (PLEG): container finished" podID="feee8d88-99ee-4507-8190-0cb66713e21d" containerID="3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a" exitCode=2
Apr 23 13:53:25.994021 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:25.993883 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" event={"ID":"feee8d88-99ee-4507-8190-0cb66713e21d","Type":"ContainerDied","Data":"3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a"}
Apr 23 13:53:26.043279 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:26.043243 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-proxy-tls\") pod \"success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:26.045647 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:26.045626 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-proxy-tls\") pod \"success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:26.188312 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:26.188279 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:26.313303 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:26.313011 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"]
Apr 23 13:53:26.315807 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:53:26.315772 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f2fe6e_2e3d_422b_9e9c_8a4fbd26be00.slice/crio-647cf372ca2efb96044d7d47b1395a96b94a2c944bb8e442b2496066698614d7 WatchSource:0}: Error finding container 647cf372ca2efb96044d7d47b1395a96b94a2c944bb8e442b2496066698614d7: Status 404 returned error can't find the container with id 647cf372ca2efb96044d7d47b1395a96b94a2c944bb8e442b2496066698614d7
Apr 23 13:53:26.997985 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:26.997945 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" event={"ID":"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00","Type":"ContainerStarted","Data":"8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947"}
Apr 23 13:53:26.997985 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:26.997988 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" event={"ID":"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00","Type":"ContainerStarted","Data":"5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17"}
Apr 23 13:53:26.998380 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:26.998003 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" event={"ID":"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00","Type":"ContainerStarted","Data":"647cf372ca2efb96044d7d47b1395a96b94a2c944bb8e442b2496066698614d7"}
Apr 23 13:53:26.998380 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:26.998139 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:27.018441 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:27.018391 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" podStartSLOduration=2.018378047 podStartE2EDuration="2.018378047s" podCreationTimestamp="2026-04-23 13:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:53:27.016730827 +0000 UTC m=+1274.399259952" watchObservedRunningTime="2026-04-23 13:53:27.018378047 +0000 UTC m=+1274.400907177"
Apr 23 13:53:27.899343 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:27.899298 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 23 13:53:28.001501 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:28.001470 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:28.002937 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:28.002907 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 13:53:28.790511 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:28.790487 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt"
Apr 23 13:53:28.971392 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:28.971287 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-f7bdd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/feee8d88-99ee-4507-8190-0cb66713e21d-success-200-isvc-f7bdd-kube-rbac-proxy-sar-config\") pod \"feee8d88-99ee-4507-8190-0cb66713e21d\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") "
Apr 23 13:53:28.971392 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:28.971348 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4vwm\" (UniqueName: \"kubernetes.io/projected/feee8d88-99ee-4507-8190-0cb66713e21d-kube-api-access-t4vwm\") pod \"feee8d88-99ee-4507-8190-0cb66713e21d\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") "
Apr 23 13:53:28.971794 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:28.971441 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/feee8d88-99ee-4507-8190-0cb66713e21d-proxy-tls\") pod \"feee8d88-99ee-4507-8190-0cb66713e21d\" (UID: \"feee8d88-99ee-4507-8190-0cb66713e21d\") "
Apr 23 13:53:28.971794 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:28.971653 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feee8d88-99ee-4507-8190-0cb66713e21d-success-200-isvc-f7bdd-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-f7bdd-kube-rbac-proxy-sar-config") pod "feee8d88-99ee-4507-8190-0cb66713e21d" (UID: "feee8d88-99ee-4507-8190-0cb66713e21d"). InnerVolumeSpecName "success-200-isvc-f7bdd-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:53:28.973538 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:28.973516 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feee8d88-99ee-4507-8190-0cb66713e21d-kube-api-access-t4vwm" (OuterVolumeSpecName: "kube-api-access-t4vwm") pod "feee8d88-99ee-4507-8190-0cb66713e21d" (UID: "feee8d88-99ee-4507-8190-0cb66713e21d"). InnerVolumeSpecName "kube-api-access-t4vwm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:53:28.973588 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:28.973562 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feee8d88-99ee-4507-8190-0cb66713e21d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "feee8d88-99ee-4507-8190-0cb66713e21d" (UID: "feee8d88-99ee-4507-8190-0cb66713e21d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:53:29.006301 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.006266 2567 generic.go:358] "Generic (PLEG): container finished" podID="feee8d88-99ee-4507-8190-0cb66713e21d" containerID="73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8" exitCode=0
Apr 23 13:53:29.006442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.006340 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt"
Apr 23 13:53:29.006442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.006347 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" event={"ID":"feee8d88-99ee-4507-8190-0cb66713e21d","Type":"ContainerDied","Data":"73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8"}
Apr 23 13:53:29.006442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.006380 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt" event={"ID":"feee8d88-99ee-4507-8190-0cb66713e21d","Type":"ContainerDied","Data":"2a01a73b75047a39636f4db28e92c992ec9292a2bcc67994ae4369b65542ff4f"}
Apr 23 13:53:29.006442 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.006394 2567 scope.go:117] "RemoveContainer" containerID="3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a"
Apr 23 13:53:29.006870 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.006841 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 13:53:29.015005 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.014990 2567 scope.go:117] "RemoveContainer" containerID="73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8"
Apr 23 13:53:29.023648 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.023627 2567 scope.go:117] "RemoveContainer" containerID="3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a"
Apr 23 13:53:29.023929 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:53:29.023911 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a\": container with ID starting with 3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a not found: ID does not exist" containerID="3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a"
Apr 23 13:53:29.023991 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.023935 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a"} err="failed to get container status \"3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a\": rpc error: code = NotFound desc = could not find container \"3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a\": container with ID starting with 3301edbcab368ddaf7fd3ae817718334f2fac80cbe21e425f03f58806f69365a not found: ID does not exist"
Apr 23 13:53:29.023991 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.023951 2567 scope.go:117] "RemoveContainer" containerID="73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8"
Apr 23 13:53:29.024173 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:53:29.024152 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8\": container with ID starting with 73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8 not found: ID does not exist" containerID="73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8"
Apr 23 13:53:29.024282 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.024177 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8"} err="failed to get container status \"73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8\": rpc error: code = NotFound desc = could not find container \"73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8\": container with ID starting with 73862e1964459bfabc74e81a6912022e0cc2377e25b4df59a5ebe90f427767d8 not found: ID does not exist"
Apr 23 13:53:29.029445 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.029416 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt"]
Apr 23 13:53:29.033735 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.033712 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f7bdd-predictor-54969f468c-dq4nt"]
Apr 23 13:53:29.072068 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.072039 2567 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-f7bdd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/feee8d88-99ee-4507-8190-0cb66713e21d-success-200-isvc-f7bdd-kube-rbac-proxy-sar-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:53:29.072068 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.072065 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t4vwm\" (UniqueName: \"kubernetes.io/projected/feee8d88-99ee-4507-8190-0cb66713e21d-kube-api-access-t4vwm\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:53:29.072248 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.072077 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/feee8d88-99ee-4507-8190-0cb66713e21d-proxy-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:53:29.210135 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:29.210104 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" path="/var/lib/kubelet/pods/feee8d88-99ee-4507-8190-0cb66713e21d/volumes"
Apr 23 13:53:34.011455 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:34.011423 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:53:34.011937 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:34.011910 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 13:53:37.899165 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:37.899130 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g"
Apr 23 13:53:44.012335 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:44.012295 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 13:53:54.012526 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:54.012487 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 13:53:59.783186 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.783144 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g"]
Apr 23 13:53:59.783670 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.783457 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kserve-container" containerID="cri-o://2a600223262e09935b0a5a0d82f27287422b9c0160c900237536a8c6ff0b5ae5" gracePeriod=30
Apr 23 13:53:59.783670 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.783484 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kube-rbac-proxy" containerID="cri-o://c0560c69d0e18ae51ad7f27faa752d8286ba3202d833460646c3aeaf54794c88" gracePeriod=30
Apr 23 13:53:59.831800 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.831769 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"]
Apr 23 13:53:59.832112 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.832100 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kube-rbac-proxy"
Apr 23 13:53:59.832160 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.832116 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kube-rbac-proxy"
Apr 23 13:53:59.832160 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.832135 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kserve-container"
Apr 23 13:53:59.832160 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.832141 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kserve-container"
Apr 23 13:53:59.832263 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.832216 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kserve-container"
Apr 23 13:53:59.832263 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.832242 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="feee8d88-99ee-4507-8190-0cb66713e21d" containerName="kube-rbac-proxy"
Apr 23 13:53:59.835399 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.835380 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:53:59.838340 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.838317 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-37fbd-predictor-serving-cert\""
Apr 23 13:53:59.838430 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.838336 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-37fbd-kube-rbac-proxy-sar-config\""
Apr 23 13:53:59.848343 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.848319 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"]
Apr 23 13:53:59.925040 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.925007 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzqkf\" (UniqueName: \"kubernetes.io/projected/3d08b04c-3401-4df1-b1c5-82a8543e716d-kube-api-access-rzqkf\") pod \"success-200-isvc-37fbd-predictor-8f56b4685-mbxhd\" (UID: \"3d08b04c-3401-4df1-b1c5-82a8543e716d\") " pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:53:59.925220 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.925131 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d08b04c-3401-4df1-b1c5-82a8543e716d-proxy-tls\") pod \"success-200-isvc-37fbd-predictor-8f56b4685-mbxhd\" (UID: \"3d08b04c-3401-4df1-b1c5-82a8543e716d\") " pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:53:59.925220 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:53:59.925155 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-37fbd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d08b04c-3401-4df1-b1c5-82a8543e716d-success-200-isvc-37fbd-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-37fbd-predictor-8f56b4685-mbxhd\" (UID: \"3d08b04c-3401-4df1-b1c5-82a8543e716d\") " pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:54:00.025871 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:00.025819 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d08b04c-3401-4df1-b1c5-82a8543e716d-proxy-tls\") pod \"success-200-isvc-37fbd-predictor-8f56b4685-mbxhd\" (UID: \"3d08b04c-3401-4df1-b1c5-82a8543e716d\") " pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:54:00.026055 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:00.025876 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-37fbd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d08b04c-3401-4df1-b1c5-82a8543e716d-success-200-isvc-37fbd-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-37fbd-predictor-8f56b4685-mbxhd\" (UID: \"3d08b04c-3401-4df1-b1c5-82a8543e716d\") " pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:54:00.026055 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:00.025946 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzqkf\" (UniqueName: \"kubernetes.io/projected/3d08b04c-3401-4df1-b1c5-82a8543e716d-kube-api-access-rzqkf\") pod \"success-200-isvc-37fbd-predictor-8f56b4685-mbxhd\" (UID: \"3d08b04c-3401-4df1-b1c5-82a8543e716d\") " pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:54:00.026587 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:00.026560 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-37fbd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d08b04c-3401-4df1-b1c5-82a8543e716d-success-200-isvc-37fbd-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-37fbd-predictor-8f56b4685-mbxhd\" (UID: \"3d08b04c-3401-4df1-b1c5-82a8543e716d\") " pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:54:00.028683 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:00.028659 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d08b04c-3401-4df1-b1c5-82a8543e716d-proxy-tls\") pod \"success-200-isvc-37fbd-predictor-8f56b4685-mbxhd\" (UID: \"3d08b04c-3401-4df1-b1c5-82a8543e716d\") " pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:54:00.036535 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:00.036477 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzqkf\" (UniqueName: \"kubernetes.io/projected/3d08b04c-3401-4df1-b1c5-82a8543e716d-kube-api-access-rzqkf\") pod \"success-200-isvc-37fbd-predictor-8f56b4685-mbxhd\" (UID: \"3d08b04c-3401-4df1-b1c5-82a8543e716d\") " pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:54:00.101250 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:00.101198 2567 generic.go:358] "Generic (PLEG): container finished" podID="392cb698-8a42-421c-bc37-5884e4f47df1" containerID="c0560c69d0e18ae51ad7f27faa752d8286ba3202d833460646c3aeaf54794c88" exitCode=2
Apr 23 13:54:00.101411 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:00.101276 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" event={"ID":"392cb698-8a42-421c-bc37-5884e4f47df1","Type":"ContainerDied","Data":"c0560c69d0e18ae51ad7f27faa752d8286ba3202d833460646c3aeaf54794c88"}
Apr 23 13:54:00.145967 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:00.145936 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:54:00.272946 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:00.272919 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"]
Apr 23 13:54:00.275152 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:54:00.275125 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d08b04c_3401_4df1_b1c5_82a8543e716d.slice/crio-0194019a2e79b88166854c12176a7898e27eedf8b18ba4b450110f1d2241b975 WatchSource:0}: Error finding container 0194019a2e79b88166854c12176a7898e27eedf8b18ba4b450110f1d2241b975: Status 404 returned error can't find the container with id 0194019a2e79b88166854c12176a7898e27eedf8b18ba4b450110f1d2241b975
Apr 23 13:54:01.105437 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:01.105399 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" event={"ID":"3d08b04c-3401-4df1-b1c5-82a8543e716d","Type":"ContainerStarted","Data":"c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3"}
Apr 23 13:54:01.105437 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:01.105438 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" event={"ID":"3d08b04c-3401-4df1-b1c5-82a8543e716d","Type":"ContainerStarted","Data":"70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265"}
Apr 23 13:54:01.105847 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:01.105448 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" event={"ID":"3d08b04c-3401-4df1-b1c5-82a8543e716d","Type":"ContainerStarted","Data":"0194019a2e79b88166854c12176a7898e27eedf8b18ba4b450110f1d2241b975"}
Apr 23 13:54:01.105847 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:01.105560 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:54:01.126282 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:01.126216 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" podStartSLOduration=2.126201317 podStartE2EDuration="2.126201317s" podCreationTimestamp="2026-04-23 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:54:01.124187464 +0000 UTC m=+1308.506716594" watchObservedRunningTime="2026-04-23 13:54:01.126201317 +0000 UTC m=+1308.508730447"
Apr 23 13:54:02.108387 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:02.108351 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:54:02.109621 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:02.109594 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 13:54:02.894000 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:02.893963 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused"
Apr 23 13:54:03.112671 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.112641 2567 generic.go:358] "Generic (PLEG): container finished" podID="392cb698-8a42-421c-bc37-5884e4f47df1" containerID="2a600223262e09935b0a5a0d82f27287422b9c0160c900237536a8c6ff0b5ae5" exitCode=0
Apr 23 13:54:03.113098 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.112713 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" event={"ID":"392cb698-8a42-421c-bc37-5884e4f47df1","Type":"ContainerDied","Data":"2a600223262e09935b0a5a0d82f27287422b9c0160c900237536a8c6ff0b5ae5"}
Apr 23 13:54:03.113098 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.113001 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 13:54:03.127189 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.127168 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g"
Apr 23 13:54:03.251558 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.251469 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/392cb698-8a42-421c-bc37-5884e4f47df1-proxy-tls\") pod \"392cb698-8a42-421c-bc37-5884e4f47df1\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") "
Apr 23 13:54:03.251721 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.251561 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzwtx\" (UniqueName: \"kubernetes.io/projected/392cb698-8a42-421c-bc37-5884e4f47df1-kube-api-access-rzwtx\") pod \"392cb698-8a42-421c-bc37-5884e4f47df1\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") "
Apr 23 13:54:03.251721 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.251600 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-2d7b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/392cb698-8a42-421c-bc37-5884e4f47df1-success-200-isvc-2d7b2-kube-rbac-proxy-sar-config\") pod \"392cb698-8a42-421c-bc37-5884e4f47df1\" (UID: \"392cb698-8a42-421c-bc37-5884e4f47df1\") "
Apr 23 13:54:03.252064 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.252036 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/392cb698-8a42-421c-bc37-5884e4f47df1-success-200-isvc-2d7b2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-2d7b2-kube-rbac-proxy-sar-config") pod "392cb698-8a42-421c-bc37-5884e4f47df1" (UID: "392cb698-8a42-421c-bc37-5884e4f47df1"). InnerVolumeSpecName "success-200-isvc-2d7b2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:54:03.253652 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.253628 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392cb698-8a42-421c-bc37-5884e4f47df1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "392cb698-8a42-421c-bc37-5884e4f47df1" (UID: "392cb698-8a42-421c-bc37-5884e4f47df1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:54:03.253743 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.253666 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392cb698-8a42-421c-bc37-5884e4f47df1-kube-api-access-rzwtx" (OuterVolumeSpecName: "kube-api-access-rzwtx") pod "392cb698-8a42-421c-bc37-5884e4f47df1" (UID: "392cb698-8a42-421c-bc37-5884e4f47df1"). InnerVolumeSpecName "kube-api-access-rzwtx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:54:03.352611 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.352575 2567 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-2d7b2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/392cb698-8a42-421c-bc37-5884e4f47df1-success-200-isvc-2d7b2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:54:03.352611 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.352606 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/392cb698-8a42-421c-bc37-5884e4f47df1-proxy-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:54:03.352611 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:03.352617 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rzwtx\" (UniqueName: \"kubernetes.io/projected/392cb698-8a42-421c-bc37-5884e4f47df1-kube-api-access-rzwtx\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\""
Apr 23 13:54:04.012701 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:04.012659 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 23 13:54:04.117604 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:04.117575 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g"
Apr 23 13:54:04.118040 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:04.117559 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g" event={"ID":"392cb698-8a42-421c-bc37-5884e4f47df1","Type":"ContainerDied","Data":"1bbf4f5688d46556c3d55edf81f83f87c22d8feb2a555bd505bf9aafb737de70"}
Apr 23 13:54:04.118040 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:04.117710 2567 scope.go:117] "RemoveContainer" containerID="c0560c69d0e18ae51ad7f27faa752d8286ba3202d833460646c3aeaf54794c88"
Apr 23 13:54:04.127302 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:04.127281 2567 scope.go:117] "RemoveContainer" containerID="2a600223262e09935b0a5a0d82f27287422b9c0160c900237536a8c6ff0b5ae5"
Apr 23 13:54:04.144291 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:04.144266 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g"]
Apr 23 13:54:04.150014 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:04.149990 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2d7b2-predictor-5885f7c96c-tld6g"]
Apr 23 13:54:05.210285 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:05.210252 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" path="/var/lib/kubelet/pods/392cb698-8a42-421c-bc37-5884e4f47df1/volumes"
Apr 23 13:54:08.118082 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:08.118046 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"
Apr 23 13:54:08.118740 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:08.118712 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 13:54:14.013215 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:14.013185 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"
Apr 23 13:54:18.119089 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:18.119047 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 13:54:28.119029 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:28.118938 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused"
Apr 23 13:54:35.489760 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.489727 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"]
Apr 23 13:54:35.490194 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.490034 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kserve-container" containerID="cri-o://5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17" gracePeriod=30
Apr 23 13:54:35.490194 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.490069 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kube-rbac-proxy" containerID="cri-o://8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947" gracePeriod=30
Apr 23 13:54:35.514694 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.514666 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb"]
Apr 23 13:54:35.515001 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.514990 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kube-rbac-proxy"
Apr 23 13:54:35.515065 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.515002 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kube-rbac-proxy"
Apr 23 13:54:35.515065 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.515016 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kserve-container"
Apr 23 13:54:35.515065 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.515023 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kserve-container"
Apr 23 13:54:35.515218 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.515084 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kube-rbac-proxy"
Apr 23 13:54:35.515218 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.515093 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="392cb698-8a42-421c-bc37-5884e4f47df1" containerName="kserve-container"
Apr 23 13:54:35.517969 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.517951 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:35.520988 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.520966 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-8d150-kube-rbac-proxy-sar-config\"" Apr 23 13:54:35.521087 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.521063 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-8d150-predictor-serving-cert\"" Apr 23 13:54:35.524694 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.524652 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb"] Apr 23 13:54:35.624741 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.624703 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d1de448-78cb-4a0a-8927-5c71196199d6-proxy-tls\") pod \"success-200-isvc-8d150-predictor-5cd95d7749-hpnmb\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:35.624963 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.624938 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-928hw\" (UniqueName: \"kubernetes.io/projected/5d1de448-78cb-4a0a-8927-5c71196199d6-kube-api-access-928hw\") pod \"success-200-isvc-8d150-predictor-5cd95d7749-hpnmb\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:35.625042 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.625026 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-8d150-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d1de448-78cb-4a0a-8927-5c71196199d6-success-200-isvc-8d150-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-8d150-predictor-5cd95d7749-hpnmb\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:35.726285 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.726246 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d1de448-78cb-4a0a-8927-5c71196199d6-proxy-tls\") pod \"success-200-isvc-8d150-predictor-5cd95d7749-hpnmb\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:35.726470 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.726342 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-928hw\" (UniqueName: \"kubernetes.io/projected/5d1de448-78cb-4a0a-8927-5c71196199d6-kube-api-access-928hw\") pod \"success-200-isvc-8d150-predictor-5cd95d7749-hpnmb\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:35.726470 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.726399 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-8d150-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/5d1de448-78cb-4a0a-8927-5c71196199d6-success-200-isvc-8d150-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-8d150-predictor-5cd95d7749-hpnmb\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:35.726470 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:54:35.726422 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-8d150-predictor-serving-cert: secret "success-200-isvc-8d150-predictor-serving-cert" not found Apr 23 13:54:35.726651 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:54:35.726507 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d1de448-78cb-4a0a-8927-5c71196199d6-proxy-tls podName:5d1de448-78cb-4a0a-8927-5c71196199d6 nodeName:}" failed. No retries permitted until 2026-04-23 13:54:36.226483592 +0000 UTC m=+1343.609012705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5d1de448-78cb-4a0a-8927-5c71196199d6-proxy-tls") pod "success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" (UID: "5d1de448-78cb-4a0a-8927-5c71196199d6") : secret "success-200-isvc-8d150-predictor-serving-cert" not found Apr 23 13:54:35.727102 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.727080 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-8d150-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d1de448-78cb-4a0a-8927-5c71196199d6-success-200-isvc-8d150-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-8d150-predictor-5cd95d7749-hpnmb\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:35.736220 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:35.736188 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-928hw\" (UniqueName: \"kubernetes.io/projected/5d1de448-78cb-4a0a-8927-5c71196199d6-kube-api-access-928hw\") pod \"success-200-isvc-8d150-predictor-5cd95d7749-hpnmb\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:36.218688 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:36.218654 2567 generic.go:358] "Generic (PLEG): container finished" podID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerID="8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947" exitCode=2 Apr 23 13:54:36.218852 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:36.218726 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" event={"ID":"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00","Type":"ContainerDied","Data":"8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947"} Apr 23 13:54:36.232090 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:36.232066 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d1de448-78cb-4a0a-8927-5c71196199d6-proxy-tls\") pod \"success-200-isvc-8d150-predictor-5cd95d7749-hpnmb\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:36.234509 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:36.234478 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/5d1de448-78cb-4a0a-8927-5c71196199d6-proxy-tls\") pod \"success-200-isvc-8d150-predictor-5cd95d7749-hpnmb\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:36.430255 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:36.430202 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:36.574472 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:36.574439 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb"] Apr 23 13:54:36.577411 ip-10-0-141-176 kubenswrapper[2567]: W0423 13:54:36.577377 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d1de448_78cb_4a0a_8927_5c71196199d6.slice/crio-d550933afa76e48d0ad35b623abb337bd60f8881de9d786b8348e606126b1928 WatchSource:0}: Error finding container d550933afa76e48d0ad35b623abb337bd60f8881de9d786b8348e606126b1928: Status 404 returned error can't find the container with id d550933afa76e48d0ad35b623abb337bd60f8881de9d786b8348e606126b1928 Apr 23 13:54:37.223520 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:37.223483 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" event={"ID":"5d1de448-78cb-4a0a-8927-5c71196199d6","Type":"ContainerStarted","Data":"b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad"} Apr 23 13:54:37.223520 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:37.223521 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" event={"ID":"5d1de448-78cb-4a0a-8927-5c71196199d6","Type":"ContainerStarted","Data":"b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5"} Apr 23 13:54:37.223759 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:37.223531 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" event={"ID":"5d1de448-78cb-4a0a-8927-5c71196199d6","Type":"ContainerStarted","Data":"d550933afa76e48d0ad35b623abb337bd60f8881de9d786b8348e606126b1928"} Apr 23 13:54:37.223759 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:37.223719 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:37.223759 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:37.223745 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:37.225312 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:37.225280 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 13:54:37.247731 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:37.247675 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" podStartSLOduration=2.2476603219999998 podStartE2EDuration="2.247660322s" podCreationTimestamp="2026-04-23 13:54:35 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:54:37.247389699 +0000 UTC m=+1344.629918829" watchObservedRunningTime="2026-04-23 13:54:37.247660322 +0000 UTC m=+1344.630189451" Apr 23 13:54:38.119191 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:38.119141 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 23 13:54:38.228022 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:38.227971 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 13:54:38.941659 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:38.941636 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" Apr 23 13:54:38.952504 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:38.952484 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxcqb\" (UniqueName: \"kubernetes.io/projected/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-kube-api-access-bxcqb\") pod \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " Apr 23 13:54:38.952579 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:38.952531 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-proxy-tls\") pod \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " Apr 23 13:54:38.952579 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:38.952567 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-1cfcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-success-200-isvc-1cfcc-kube-rbac-proxy-sar-config\") pod \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\" (UID: \"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00\") " Apr 23 13:54:38.952949 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:38.952921 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-success-200-isvc-1cfcc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-1cfcc-kube-rbac-proxy-sar-config") pod "f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" (UID: "f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00"). InnerVolumeSpecName "success-200-isvc-1cfcc-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:54:38.954622 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:38.954594 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" (UID: "f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:54:38.954719 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:38.954664 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-kube-api-access-bxcqb" (OuterVolumeSpecName: "kube-api-access-bxcqb") pod "f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" (UID: "f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00"). InnerVolumeSpecName "kube-api-access-bxcqb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:54:39.053601 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.053514 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bxcqb\" (UniqueName: \"kubernetes.io/projected/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-kube-api-access-bxcqb\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:54:39.053601 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.053544 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-proxy-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:54:39.053601 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.053555 2567 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-1cfcc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00-success-200-isvc-1cfcc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 13:54:39.233765 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.233727 2567 generic.go:358] "Generic (PLEG): container finished" podID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerID="5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17" exitCode=0 Apr 23 13:54:39.234307 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.233816 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" Apr 23 13:54:39.234307 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.233816 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" event={"ID":"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00","Type":"ContainerDied","Data":"5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17"} Apr 23 13:54:39.234307 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.233860 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv" event={"ID":"f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00","Type":"ContainerDied","Data":"647cf372ca2efb96044d7d47b1395a96b94a2c944bb8e442b2496066698614d7"} Apr 23 13:54:39.234307 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.233877 2567 scope.go:117] "RemoveContainer" containerID="8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947" Apr 23 13:54:39.242779 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.242746 2567 scope.go:117] "RemoveContainer" containerID="5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17" Apr 23 13:54:39.250131 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.250108 2567 scope.go:117] "RemoveContainer" containerID="8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947" Apr 23 13:54:39.250415 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:54:39.250396 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947\": container with ID starting with 8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947 not found: ID does not exist" containerID="8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947" Apr 23 13:54:39.250470 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.250425 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947"} err="failed to get container status \"8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947\": rpc error: code = NotFound desc = could not find container \"8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947\": container with ID starting with 8d9bf972f71a02c65257a678d7adc1f4ec64571119b52eab0dde45803b25e947 not found: ID does not exist" Apr 23 13:54:39.250470 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.250444 2567 scope.go:117] "RemoveContainer" containerID="5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17" Apr 23 13:54:39.250708 ip-10-0-141-176 kubenswrapper[2567]: E0423 13:54:39.250691 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17\": container with ID starting with 5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17 not found: ID does not exist" containerID="5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17" Apr 23 13:54:39.250761 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.250713 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17"} err="failed to get container status \"5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17\": rpc 
error: code = NotFound desc = could not find container \"5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17\": container with ID starting with 5791114eceb1a732b0b66a7d5b90e166389cd85606930a90d48c0489a64ffb17 not found: ID does not exist" Apr 23 13:54:39.256241 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.256198 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"] Apr 23 13:54:39.259109 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:39.259083 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1cfcc-predictor-5b5dd86bb9-h48hv"] Apr 23 13:54:41.209826 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:41.209792 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" path="/var/lib/kubelet/pods/f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00/volumes" Apr 23 13:54:43.233873 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:43.233839 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:54:43.234444 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:43.234413 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 13:54:48.119390 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:48.119361 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" Apr 23 13:54:53.234583 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:54:53.234538 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 13:55:03.235356 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:55:03.235314 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 13:55:13.234391 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:55:13.234342 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 13:55:23.235382 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:55:23.235342 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 13:57:15.424008 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:57:15.423888 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 13:57:15.428111 ip-10-0-141-176 kubenswrapper[2567]: I0423 13:57:15.426307 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 14:02:15.445180 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:02:15.445060 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 14:02:15.454695 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:02:15.454670 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 14:03:14.632545 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.632509 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"] Apr 23 14:03:14.633051 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.632807 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kserve-container" containerID="cri-o://70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265" gracePeriod=30 Apr 23 14:03:14.633051 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.632898 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kube-rbac-proxy" containerID="cri-o://c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3" gracePeriod=30 Apr 23 14:03:14.732122 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.732087 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s"] Apr 23 14:03:14.732463 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.732450 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kserve-container" Apr 23 14:03:14.732515 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.732464 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kserve-container" Apr 23 14:03:14.732515 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.732474 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kube-rbac-proxy" Apr 23 14:03:14.732515 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.732479 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kube-rbac-proxy" Apr 23 14:03:14.732615 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.732546 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kube-rbac-proxy" Apr 23 14:03:14.732615 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.732554 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4f2fe6e-2e3d-422b-9e9c-8a4fbd26be00" containerName="kserve-container" Apr 23 14:03:14.735545 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.735530 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:14.738520 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.738493 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-5035e-kube-rbac-proxy-sar-config\"" Apr 23 14:03:14.738658 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.738493 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-5035e-predictor-serving-cert\"" Apr 23 14:03:14.756527 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.756501 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s"] Apr 23 14:03:14.850583 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.850542 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-proxy-tls\") pod \"success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s\" (UID: \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\") " pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:14.850782 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.850607 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlngj\" (UniqueName: \"kubernetes.io/projected/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-kube-api-access-jlngj\") pod \"success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s\" (UID: \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\") " pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:14.850782 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.850700 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-5035e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-success-200-isvc-5035e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s\" (UID: \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\") " pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:14.951638 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.951552 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-5035e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-success-200-isvc-5035e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s\" (UID: \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\") " pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:14.951638 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.951614 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-proxy-tls\") pod \"success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s\" (UID: \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\") " pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:14.951860 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.951649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlngj\" (UniqueName: 
\"kubernetes.io/projected/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-kube-api-access-jlngj\") pod \"success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s\" (UID: \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\") " pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:14.952292 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.952267 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-5035e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-success-200-isvc-5035e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s\" (UID: \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\") " pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:14.954074 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.954055 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-proxy-tls\") pod \"success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s\" (UID: \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\") " pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:14.962155 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:14.962130 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlngj\" (UniqueName: \"kubernetes.io/projected/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-kube-api-access-jlngj\") pod \"success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s\" (UID: \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\") " pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:15.047359 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:15.047314 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:15.174502 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:15.174469 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s"] Apr 23 14:03:15.177618 ip-10-0-141-176 kubenswrapper[2567]: W0423 14:03:15.177587 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ed1d85a_d2dc_4b9d_b42f_de90afc04a20.slice/crio-08fe0eb18e35be4c65eacdf883e04dfa38da492b205fd217696d1b44d00c43a8 WatchSource:0}: Error finding container 08fe0eb18e35be4c65eacdf883e04dfa38da492b205fd217696d1b44d00c43a8: Status 404 returned error can't find the container with id 08fe0eb18e35be4c65eacdf883e04dfa38da492b205fd217696d1b44d00c43a8 Apr 23 14:03:15.179369 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:15.179354 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:03:15.719971 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:15.719933 2567 generic.go:358] "Generic (PLEG): container finished" podID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerID="c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3" exitCode=2 Apr 23 14:03:15.720427 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:15.720007 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" event={"ID":"3d08b04c-3401-4df1-b1c5-82a8543e716d","Type":"ContainerDied","Data":"c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3"} Apr 23 14:03:15.721624 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:15.721593 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" event={"ID":"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20","Type":"ContainerStarted","Data":"0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa"} Apr 23 14:03:15.721624 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:15.721632 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" event={"ID":"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20","Type":"ContainerStarted","Data":"bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b"} Apr 23 14:03:15.721775 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:15.721645 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" event={"ID":"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20","Type":"ContainerStarted","Data":"08fe0eb18e35be4c65eacdf883e04dfa38da492b205fd217696d1b44d00c43a8"} Apr 23 14:03:15.721809 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:15.721786 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:15.742156 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:15.742096 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" podStartSLOduration=1.742077812 podStartE2EDuration="1.742077812s" podCreationTimestamp="2026-04-23 14:03:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:03:15.740927286 +0000 UTC m=+1863.123456416" 
watchObservedRunningTime="2026-04-23 14:03:15.742077812 +0000 UTC m=+1863.124606943" Apr 23 14:03:16.725109 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:16.725070 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:16.726359 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:16.726332 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 14:03:17.728609 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:17.728571 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 14:03:17.978823 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:17.978767 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" Apr 23 14:03:18.079925 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.079890 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-37fbd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d08b04c-3401-4df1-b1c5-82a8543e716d-success-200-isvc-37fbd-kube-rbac-proxy-sar-config\") pod \"3d08b04c-3401-4df1-b1c5-82a8543e716d\" (UID: \"3d08b04c-3401-4df1-b1c5-82a8543e716d\") " Apr 23 14:03:18.080100 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.079945 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d08b04c-3401-4df1-b1c5-82a8543e716d-proxy-tls\") pod \"3d08b04c-3401-4df1-b1c5-82a8543e716d\" (UID: \"3d08b04c-3401-4df1-b1c5-82a8543e716d\") " Apr 23 14:03:18.080100 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.079999 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzqkf\" (UniqueName: \"kubernetes.io/projected/3d08b04c-3401-4df1-b1c5-82a8543e716d-kube-api-access-rzqkf\") pod \"3d08b04c-3401-4df1-b1c5-82a8543e716d\" (UID: \"3d08b04c-3401-4df1-b1c5-82a8543e716d\") " Apr 23 14:03:18.080295 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.080271 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d08b04c-3401-4df1-b1c5-82a8543e716d-success-200-isvc-37fbd-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-37fbd-kube-rbac-proxy-sar-config") pod "3d08b04c-3401-4df1-b1c5-82a8543e716d" (UID: "3d08b04c-3401-4df1-b1c5-82a8543e716d"). InnerVolumeSpecName "success-200-isvc-37fbd-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:03:18.082173 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.082142 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d08b04c-3401-4df1-b1c5-82a8543e716d-kube-api-access-rzqkf" (OuterVolumeSpecName: "kube-api-access-rzqkf") pod "3d08b04c-3401-4df1-b1c5-82a8543e716d" (UID: "3d08b04c-3401-4df1-b1c5-82a8543e716d"). InnerVolumeSpecName "kube-api-access-rzqkf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:03:18.082316 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.082190 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d08b04c-3401-4df1-b1c5-82a8543e716d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3d08b04c-3401-4df1-b1c5-82a8543e716d" (UID: "3d08b04c-3401-4df1-b1c5-82a8543e716d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:03:18.180810 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.180777 2567 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-37fbd-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3d08b04c-3401-4df1-b1c5-82a8543e716d-success-200-isvc-37fbd-kube-rbac-proxy-sar-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:03:18.180810 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.180803 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d08b04c-3401-4df1-b1c5-82a8543e716d-proxy-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:03:18.180810 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.180814 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rzqkf\" (UniqueName: \"kubernetes.io/projected/3d08b04c-3401-4df1-b1c5-82a8543e716d-kube-api-access-rzqkf\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:03:18.732343 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.732311 2567 generic.go:358] "Generic (PLEG): container finished" podID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerID="70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265" exitCode=0 Apr 23 14:03:18.732753 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.732389 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" Apr 23 14:03:18.732753 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.732400 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" event={"ID":"3d08b04c-3401-4df1-b1c5-82a8543e716d","Type":"ContainerDied","Data":"70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265"} Apr 23 14:03:18.732753 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.732439 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd" event={"ID":"3d08b04c-3401-4df1-b1c5-82a8543e716d","Type":"ContainerDied","Data":"0194019a2e79b88166854c12176a7898e27eedf8b18ba4b450110f1d2241b975"} Apr 23 14:03:18.732753 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.732455 2567 scope.go:117] "RemoveContainer" containerID="c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3" Apr 23 14:03:18.740702 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.740678 2567 scope.go:117] "RemoveContainer" containerID="70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265" Apr 23 14:03:18.747662 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.747646 2567 scope.go:117] "RemoveContainer" containerID="c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3" Apr 23 14:03:18.747912 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:03:18.747891 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3\": container with ID starting with c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3 not found: ID does not exist" containerID="c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3" Apr 23 14:03:18.747981 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.747924 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3"} err="failed to get container status \"c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3\": rpc error: code = NotFound desc = could not find container \"c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3\": container with ID starting with c73eb77d7c1753402a80815801ff180d478e7331718314bedc9a72928b85fbf3 not found: ID does not exist" Apr 23 14:03:18.747981 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.747948 2567 scope.go:117] "RemoveContainer" containerID="70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265" Apr 23 14:03:18.748170 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:03:18.748153 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265\": container with ID starting with 70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265 not found: ID does not exist" containerID="70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265" Apr 23 14:03:18.748235 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.748175 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265"} err="failed to get container status \"70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265\": rpc 
error: code = NotFound desc = could not find container \"70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265\": container with ID starting with 70bfb1ab0db12b989e8abfc96196e63f4c09c7201ad1c5dbc81cfcdbc3500265 not found: ID does not exist" Apr 23 14:03:18.755313 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.755290 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"] Apr 23 14:03:18.760293 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:18.760268 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-37fbd-predictor-8f56b4685-mbxhd"] Apr 23 14:03:19.209704 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:19.209669 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" path="/var/lib/kubelet/pods/3d08b04c-3401-4df1-b1c5-82a8543e716d/volumes" Apr 23 14:03:22.732645 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:22.732617 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:03:22.733115 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:22.733023 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 14:03:32.733551 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:32.733460 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 14:03:42.733594 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:42.733542 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 14:03:50.403962 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.403926 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb"] Apr 23 14:03:50.404513 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.404269 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kserve-container" containerID="cri-o://b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5" gracePeriod=30 Apr 23 14:03:50.404513 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.404311 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kube-rbac-proxy" containerID="cri-o://b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad" gracePeriod=30 Apr 23 14:03:50.449277 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.449238 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt"] Apr 23 14:03:50.449722 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.449706 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kube-rbac-proxy" Apr 23 14:03:50.449796 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.449724 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kube-rbac-proxy" Apr 23 14:03:50.449796 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.449738 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kserve-container" Apr 23 14:03:50.449796 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.449747 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kserve-container" Apr 23 14:03:50.449888 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.449821 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kserve-container" Apr 23 14:03:50.449888 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.449836 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d08b04c-3401-4df1-b1c5-82a8543e716d" containerName="kube-rbac-proxy" Apr 23 14:03:50.453290 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.453269 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:50.457323 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.457298 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-feffa-predictor-serving-cert\"" Apr 23 14:03:50.457613 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.457598 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-feffa-kube-rbac-proxy-sar-config\"" Apr 23 14:03:50.470776 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.470743 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt"] Apr 23 14:03:50.554150 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.554104 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-feffa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-success-200-isvc-feffa-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-feffa-predictor-5cff45889-498mt\" (UID: \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\") " pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:50.554150 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.554154 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-proxy-tls\") pod \"success-200-isvc-feffa-predictor-5cff45889-498mt\" (UID: \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\") " pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:50.554374 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.554290 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-g48tk\" (UniqueName: \"kubernetes.io/projected/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-kube-api-access-g48tk\") pod \"success-200-isvc-feffa-predictor-5cff45889-498mt\" (UID: \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\") " pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:50.655111 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.655028 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-feffa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-success-200-isvc-feffa-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-feffa-predictor-5cff45889-498mt\" (UID: \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\") " pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:50.655111 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.655072 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-proxy-tls\") pod \"success-200-isvc-feffa-predictor-5cff45889-498mt\" (UID: \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\") " pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:50.655372 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.655135 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g48tk\" (UniqueName: \"kubernetes.io/projected/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-kube-api-access-g48tk\") pod \"success-200-isvc-feffa-predictor-5cff45889-498mt\" (UID: \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\") " pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:50.655696 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.655666 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-feffa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-success-200-isvc-feffa-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-feffa-predictor-5cff45889-498mt\" (UID: \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\") " pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:50.657510 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.657486 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-proxy-tls\") pod \"success-200-isvc-feffa-predictor-5cff45889-498mt\" (UID: \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\") " pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:50.665464 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.665443 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g48tk\" (UniqueName: \"kubernetes.io/projected/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-kube-api-access-g48tk\") pod \"success-200-isvc-feffa-predictor-5cff45889-498mt\" (UID: \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\") " pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:50.764026 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.763981 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:50.834192 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.834138 2567 generic.go:358] "Generic (PLEG): container finished" podID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerID="b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad" exitCode=2 Apr 23 14:03:50.834381 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.834273 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" event={"ID":"5d1de448-78cb-4a0a-8927-5c71196199d6","Type":"ContainerDied","Data":"b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad"} Apr 23 14:03:50.894872 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:50.894843 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt"] Apr 23 14:03:50.897073 ip-10-0-141-176 kubenswrapper[2567]: W0423 14:03:50.897044 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e0ec7bd_e53b_4dbf_923e_d7303fdf8db4.slice/crio-9375fa7f7577a6363719c8a1cf353e699d95eac28335d27fde257684435a5733 WatchSource:0}: Error finding container 9375fa7f7577a6363719c8a1cf353e699d95eac28335d27fde257684435a5733: Status 404 returned error can't find the container with id 9375fa7f7577a6363719c8a1cf353e699d95eac28335d27fde257684435a5733 Apr 23 14:03:51.838647 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:51.838612 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" event={"ID":"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4","Type":"ContainerStarted","Data":"aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b"} Apr 23 14:03:51.838647 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:51.838647 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" event={"ID":"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4","Type":"ContainerStarted","Data":"8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b"} Apr 23 14:03:51.838647 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:51.838658 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" event={"ID":"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4","Type":"ContainerStarted","Data":"9375fa7f7577a6363719c8a1cf353e699d95eac28335d27fde257684435a5733"} Apr 23 14:03:51.839251 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:51.838781 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:51.860494 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:51.860443 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" podStartSLOduration=1.8604299979999999 podStartE2EDuration="1.860429998s" podCreationTimestamp="2026-04-23 14:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:03:51.858732987 +0000 UTC m=+1899.241262121" watchObservedRunningTime="2026-04-23 14:03:51.860429998 +0000 UTC m=+1899.242959128" Apr 23 14:03:52.733705 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:52.733655 2567 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:8080: connect: connection refused" Apr 23 14:03:52.841786 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:52.841759 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:52.843138 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:52.843103 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 14:03:53.229158 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.229107 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.30:8643/healthz\": dial tcp 10.133.0.30:8643: connect: connection refused" Apr 23 14:03:53.235165 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.235130 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 14:03:53.751249 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.751200 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 14:03:53.846394 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.846307 2567 generic.go:358] "Generic (PLEG): container finished" podID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerID="b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5" exitCode=0 Apr 23 14:03:53.846394 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.846385 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" Apr 23 14:03:53.846906 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.846389 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" event={"ID":"5d1de448-78cb-4a0a-8927-5c71196199d6","Type":"ContainerDied","Data":"b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5"} Apr 23 14:03:53.846906 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.846514 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb" event={"ID":"5d1de448-78cb-4a0a-8927-5c71196199d6","Type":"ContainerDied","Data":"d550933afa76e48d0ad35b623abb337bd60f8881de9d786b8348e606126b1928"} Apr 23 14:03:53.846906 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.846548 2567 scope.go:117] "RemoveContainer" containerID="b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad" Apr 23 14:03:53.847050 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.846991 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 14:03:53.854709 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.854692 2567 scope.go:117] "RemoveContainer" containerID="b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5" Apr 23 14:03:53.861711 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.861681 2567 scope.go:117] "RemoveContainer" containerID="b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad" Apr 23 14:03:53.861980 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:03:53.861962 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad\": container with ID starting with b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad not found: ID does not exist" containerID="b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad" Apr 23 14:03:53.862039 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.861989 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad"} err="failed to get container status \"b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad\": rpc error: code = NotFound desc = could not find container \"b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad\": container with ID starting with b65e9ecdea70c79804891440b9d6c448d6faed75479df4d733f62bf64f8c8dad not found: ID does not exist" Apr 23 14:03:53.862039 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.862009 2567 scope.go:117] "RemoveContainer" containerID="b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5" Apr 23 14:03:53.862293 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:03:53.862262 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5\": container with ID starting with b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5 not found: ID does not exist" 
containerID="b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5" Apr 23 14:03:53.862355 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.862299 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5"} err="failed to get container status \"b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5\": rpc error: code = NotFound desc = could not find container \"b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5\": container with ID starting with b522ac90d4937dbbc20ec14a6d1383a84d0b450e2c87c412c04e851b8bdbe5d5 not found: ID does not exist" Apr 23 14:03:53.881608 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.881581 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d1de448-78cb-4a0a-8927-5c71196199d6-proxy-tls\") pod \"5d1de448-78cb-4a0a-8927-5c71196199d6\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " Apr 23 14:03:53.881734 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.881635 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-928hw\" (UniqueName: \"kubernetes.io/projected/5d1de448-78cb-4a0a-8927-5c71196199d6-kube-api-access-928hw\") pod \"5d1de448-78cb-4a0a-8927-5c71196199d6\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " Apr 23 14:03:53.881792 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.881730 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-8d150-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d1de448-78cb-4a0a-8927-5c71196199d6-success-200-isvc-8d150-kube-rbac-proxy-sar-config\") pod \"5d1de448-78cb-4a0a-8927-5c71196199d6\" (UID: \"5d1de448-78cb-4a0a-8927-5c71196199d6\") " Apr 23 14:03:53.882085 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.882063 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d1de448-78cb-4a0a-8927-5c71196199d6-success-200-isvc-8d150-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-8d150-kube-rbac-proxy-sar-config") pod "5d1de448-78cb-4a0a-8927-5c71196199d6" (UID: "5d1de448-78cb-4a0a-8927-5c71196199d6"). InnerVolumeSpecName "success-200-isvc-8d150-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:03:53.883737 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.883713 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1de448-78cb-4a0a-8927-5c71196199d6-kube-api-access-928hw" (OuterVolumeSpecName: "kube-api-access-928hw") pod "5d1de448-78cb-4a0a-8927-5c71196199d6" (UID: "5d1de448-78cb-4a0a-8927-5c71196199d6"). InnerVolumeSpecName "kube-api-access-928hw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:03:53.883834 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.883738 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d1de448-78cb-4a0a-8927-5c71196199d6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5d1de448-78cb-4a0a-8927-5c71196199d6" (UID: "5d1de448-78cb-4a0a-8927-5c71196199d6"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:03:53.983365 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.983322 2567 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-8d150-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d1de448-78cb-4a0a-8927-5c71196199d6-success-200-isvc-8d150-kube-rbac-proxy-sar-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:03:53.983536 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.983435 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d1de448-78cb-4a0a-8927-5c71196199d6-proxy-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:03:53.983536 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:53.983454 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-928hw\" (UniqueName: \"kubernetes.io/projected/5d1de448-78cb-4a0a-8927-5c71196199d6-kube-api-access-928hw\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:03:54.168619 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:54.168584 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb"] Apr 23 14:03:54.174653 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:54.174627 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-8d150-predictor-5cd95d7749-hpnmb"] Apr 23 14:03:55.209832 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:55.209783 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" path="/var/lib/kubelet/pods/5d1de448-78cb-4a0a-8927-5c71196199d6/volumes" Apr 23 14:03:58.851373 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:58.851342 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:03:58.851751 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:03:58.851726 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 14:04:02.734013 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:02.733977 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:04:08.852800 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:08.852757 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 14:04:18.852725 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:18.852682 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 14:04:24.980070 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:24.980025 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s"] Apr 23 14:04:24.980606 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:24.980339 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kserve-container" containerID="cri-o://bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b" gracePeriod=30 Apr 23 14:04:24.980606 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:24.980374 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kube-rbac-proxy" containerID="cri-o://0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa" gracePeriod=30 Apr 23 14:04:25.015733 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.015697 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr"] Apr 23 14:04:25.016087 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.016073 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kserve-container" Apr 23 14:04:25.016138 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.016090 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kserve-container" Apr 23 14:04:25.016138 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.016111 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kube-rbac-proxy" Apr 23 14:04:25.016138 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.016119 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kube-rbac-proxy" Apr 23 14:04:25.016269 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.016187 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kserve-container" Apr 23 14:04:25.016269 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.016207 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d1de448-78cb-4a0a-8927-5c71196199d6" containerName="kube-rbac-proxy" Apr 23 14:04:25.020521 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.020495 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:25.023580 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.023546 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a8c3e-predictor-serving-cert\"" Apr 23 14:04:25.023774 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.023753 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a8c3e-kube-rbac-proxy-sar-config\"" Apr 23 14:04:25.030692 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.030658 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr"] Apr 23 14:04:25.128345 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.128299 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c88719ce-0c49-469d-b128-da335ca5a48e-proxy-tls\") pod \"success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:25.128546 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.128464 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5hhr\" (UniqueName: \"kubernetes.io/projected/c88719ce-0c49-469d-b128-da335ca5a48e-kube-api-access-l5hhr\") pod \"success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:25.128546 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.128515 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-a8c3e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c88719ce-0c49-469d-b128-da335ca5a48e-success-200-isvc-a8c3e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:25.229163 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.229115 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5hhr\" (UniqueName: \"kubernetes.io/projected/c88719ce-0c49-469d-b128-da335ca5a48e-kube-api-access-l5hhr\") pod \"success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:25.229366 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.229173 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-a8c3e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c88719ce-0c49-469d-b128-da335ca5a48e-success-200-isvc-a8c3e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:25.229366 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.229244 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/c88719ce-0c49-469d-b128-da335ca5a48e-proxy-tls\") pod \"success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:25.229479 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:04:25.229403 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-serving-cert: secret "success-200-isvc-a8c3e-predictor-serving-cert" not found Apr 23 14:04:25.229479 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:04:25.229463 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c88719ce-0c49-469d-b128-da335ca5a48e-proxy-tls podName:c88719ce-0c49-469d-b128-da335ca5a48e nodeName:}" failed. No retries permitted until 2026-04-23 14:04:25.729442927 +0000 UTC m=+1933.111972052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c88719ce-0c49-469d-b128-da335ca5a48e-proxy-tls") pod "success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" (UID: "c88719ce-0c49-469d-b128-da335ca5a48e") : secret "success-200-isvc-a8c3e-predictor-serving-cert" not found Apr 23 14:04:25.229965 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.229929 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-a8c3e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c88719ce-0c49-469d-b128-da335ca5a48e-success-200-isvc-a8c3e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:25.238507 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.238435 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5hhr\" (UniqueName: \"kubernetes.io/projected/c88719ce-0c49-469d-b128-da335ca5a48e-kube-api-access-l5hhr\") pod \"success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:25.741627 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.741585 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c88719ce-0c49-469d-b128-da335ca5a48e-proxy-tls\") pod \"success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:25.744108 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.744080 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c88719ce-0c49-469d-b128-da335ca5a48e-proxy-tls\") pod \"success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:25.932793 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.932757 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:25.943516 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.943478 2567 generic.go:358] "Generic (PLEG): container finished" podID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerID="0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa" exitCode=2 Apr 23 14:04:25.943667 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:25.943547 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" event={"ID":"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20","Type":"ContainerDied","Data":"0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa"} Apr 23 14:04:26.061165 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:26.061133 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr"] Apr 23 14:04:26.064198 ip-10-0-141-176 kubenswrapper[2567]: W0423 14:04:26.064169 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc88719ce_0c49_469d_b128_da335ca5a48e.slice/crio-5be1d5335b65f79b9f94a43e7346f78eb80eb58387d70ef3ff5620391dcde90a WatchSource:0}: Error finding container 5be1d5335b65f79b9f94a43e7346f78eb80eb58387d70ef3ff5620391dcde90a: Status 404 returned error can't find the container with id 5be1d5335b65f79b9f94a43e7346f78eb80eb58387d70ef3ff5620391dcde90a Apr 23 14:04:26.948631 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:26.948596 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" event={"ID":"c88719ce-0c49-469d-b128-da335ca5a48e","Type":"ContainerStarted","Data":"d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a"} Apr 23 14:04:26.948631 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:26.948638 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" event={"ID":"c88719ce-0c49-469d-b128-da335ca5a48e","Type":"ContainerStarted","Data":"90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f"} Apr 23 14:04:26.948902 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:26.948653 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" event={"ID":"c88719ce-0c49-469d-b128-da335ca5a48e","Type":"ContainerStarted","Data":"5be1d5335b65f79b9f94a43e7346f78eb80eb58387d70ef3ff5620391dcde90a"} Apr 23 14:04:26.948902 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:26.948680 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:26.971270 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:26.971200 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" podStartSLOduration=2.971186839 podStartE2EDuration="2.971186839s" podCreationTimestamp="2026-04-23 14:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:04:26.968959127 +0000 UTC m=+1934.351488260" watchObservedRunningTime="2026-04-23 14:04:26.971186839 +0000 UTC m=+1934.353715968" Apr 23 14:04:27.729686 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:27.729628 2567 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.31:8643/healthz\": dial tcp 10.133.0.31:8643: connect: connection refused" Apr 23 14:04:27.952289 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:27.952254 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:27.953692 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:27.953661 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 14:04:28.427469 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.427442 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:04:28.565928 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.565829 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlngj\" (UniqueName: \"kubernetes.io/projected/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-kube-api-access-jlngj\") pod \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\" (UID: \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\") " Apr 23 14:04:28.566107 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.565932 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-proxy-tls\") pod \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\" (UID: \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\") " Apr 23 14:04:28.566107 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.566016 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-5035e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-success-200-isvc-5035e-kube-rbac-proxy-sar-config\") pod \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\" (UID: \"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20\") " Apr 23 14:04:28.566414 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.566380 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-success-200-isvc-5035e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-5035e-kube-rbac-proxy-sar-config") pod "4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" (UID: "4ed1d85a-d2dc-4b9d-b42f-de90afc04a20"). InnerVolumeSpecName "success-200-isvc-5035e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:04:28.567917 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.567893 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-kube-api-access-jlngj" (OuterVolumeSpecName: "kube-api-access-jlngj") pod "4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" (UID: "4ed1d85a-d2dc-4b9d-b42f-de90afc04a20"). InnerVolumeSpecName "kube-api-access-jlngj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:04:28.568108 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.568085 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" (UID: "4ed1d85a-d2dc-4b9d-b42f-de90afc04a20"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:04:28.667462 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.667425 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-proxy-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:04:28.667462 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.667458 2567 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-5035e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-success-200-isvc-5035e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:04:28.667672 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.667479 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jlngj\" (UniqueName: \"kubernetes.io/projected/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20-kube-api-access-jlngj\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:04:28.852507 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.852410 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 23 14:04:28.956904 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.956863 2567 generic.go:358] "Generic (PLEG): container finished" podID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerID="bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b" exitCode=0 Apr 23 14:04:28.957079 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.956927 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" event={"ID":"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20","Type":"ContainerDied","Data":"bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b"} Apr 23 14:04:28.957079 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.956953 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" Apr 23 14:04:28.957079 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.956964 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s" event={"ID":"4ed1d85a-d2dc-4b9d-b42f-de90afc04a20","Type":"ContainerDied","Data":"08fe0eb18e35be4c65eacdf883e04dfa38da492b205fd217696d1b44d00c43a8"} Apr 23 14:04:28.957079 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.956981 2567 scope.go:117] "RemoveContainer" containerID="0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa" Apr 23 14:04:28.957671 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.957637 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 14:04:28.965921 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.965903 2567 scope.go:117] "RemoveContainer" containerID="bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b" Apr 23 14:04:28.973298 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.973279 2567 scope.go:117] "RemoveContainer" containerID="0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa" Apr 23 14:04:28.973561 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:04:28.973541 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa\": container with ID starting with 0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa not found: ID does not exist" containerID="0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa" Apr 23 14:04:28.973624 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.973569 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa"} err="failed to get container status \"0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa\": rpc error: code = NotFound desc = could not find container \"0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa\": container with ID starting with 0cc706739c28e94e0a62c023c8a28a528eaa9fbe09b9f2f8aa39cce1e62ce6aa not found: ID does not exist" Apr 23 14:04:28.973624 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.973586 2567 scope.go:117] "RemoveContainer" containerID="bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b" Apr 23 14:04:28.973816 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:04:28.973799 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b\": container with ID starting with bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b not found: ID does not exist" containerID="bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b" Apr 23 14:04:28.973859 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.973821 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b"} err="failed to get container status 
\"bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b\": rpc error: code = NotFound desc = could not find container \"bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b\": container with ID starting with bbd7047d2db98b167d472a36bb44801f98c4e4fb03e2b508fa2ca464dfb1fc0b not found: ID does not exist" Apr 23 14:04:28.980846 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.980821 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s"] Apr 23 14:04:28.983593 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:28.983571 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5035e-predictor-6cc7687ff7-sbk7s"] Apr 23 14:04:29.210341 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:29.210310 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" path="/var/lib/kubelet/pods/4ed1d85a-d2dc-4b9d-b42f-de90afc04a20/volumes" Apr 23 14:04:33.963117 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:33.963085 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:04:33.963566 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:33.963539 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 14:04:38.853472 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:38.853440 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:04:43.964245 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:43.964183 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 14:04:53.964413 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:04:53.964368 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 14:05:03.964068 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:05:03.964026 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 14:05:13.964409 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:05:13.964375 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:07:15.466993 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:07:15.466877 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 14:07:15.476878 ip-10-0-141-176 kubenswrapper[2567]: I0423 
14:07:15.476856 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 14:12:15.487953 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:12:15.487846 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 14:12:15.497904 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:12:15.497880 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 14:13:39.831342 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:39.831304 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr"] Apr 23 14:13:39.831841 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:39.831572 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kserve-container" containerID="cri-o://90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f" gracePeriod=30 Apr 23 14:13:39.831841 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:39.831598 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kube-rbac-proxy" containerID="cri-o://d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a" gracePeriod=30 Apr 23 14:13:40.580141 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:40.580102 2567 generic.go:358] "Generic (PLEG): container finished" podID="c88719ce-0c49-469d-b128-da335ca5a48e" containerID="d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a" exitCode=2 Apr 23 14:13:40.580338 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:40.580174 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" event={"ID":"c88719ce-0c49-469d-b128-da335ca5a48e","Type":"ContainerDied","Data":"d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a"} Apr 23 14:13:42.970692 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:42.970666 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:13:43.020823 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.020796 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c88719ce-0c49-469d-b128-da335ca5a48e-proxy-tls\") pod \"c88719ce-0c49-469d-b128-da335ca5a48e\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " Apr 23 14:13:43.020988 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.020831 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5hhr\" (UniqueName: \"kubernetes.io/projected/c88719ce-0c49-469d-b128-da335ca5a48e-kube-api-access-l5hhr\") pod \"c88719ce-0c49-469d-b128-da335ca5a48e\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " Apr 23 14:13:43.020988 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.020855 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-a8c3e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c88719ce-0c49-469d-b128-da335ca5a48e-success-200-isvc-a8c3e-kube-rbac-proxy-sar-config\") pod \"c88719ce-0c49-469d-b128-da335ca5a48e\" (UID: \"c88719ce-0c49-469d-b128-da335ca5a48e\") " Apr 23 14:13:43.021332 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.021306 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c88719ce-0c49-469d-b128-da335ca5a48e-success-200-isvc-a8c3e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-a8c3e-kube-rbac-proxy-sar-config") pod "c88719ce-0c49-469d-b128-da335ca5a48e" (UID: "c88719ce-0c49-469d-b128-da335ca5a48e"). InnerVolumeSpecName "success-200-isvc-a8c3e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:13:43.022923 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.022890 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c88719ce-0c49-469d-b128-da335ca5a48e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c88719ce-0c49-469d-b128-da335ca5a48e" (UID: "c88719ce-0c49-469d-b128-da335ca5a48e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:13:43.022923 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.022897 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c88719ce-0c49-469d-b128-da335ca5a48e-kube-api-access-l5hhr" (OuterVolumeSpecName: "kube-api-access-l5hhr") pod "c88719ce-0c49-469d-b128-da335ca5a48e" (UID: "c88719ce-0c49-469d-b128-da335ca5a48e"). InnerVolumeSpecName "kube-api-access-l5hhr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:13:43.121405 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.121373 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c88719ce-0c49-469d-b128-da335ca5a48e-proxy-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:13:43.121405 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.121402 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l5hhr\" (UniqueName: \"kubernetes.io/projected/c88719ce-0c49-469d-b128-da335ca5a48e-kube-api-access-l5hhr\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:13:43.121593 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.121413 2567 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-a8c3e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c88719ce-0c49-469d-b128-da335ca5a48e-success-200-isvc-a8c3e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:13:43.592271 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.592160 2567 generic.go:358] "Generic (PLEG): container finished" podID="c88719ce-0c49-469d-b128-da335ca5a48e" containerID="90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f" exitCode=0 Apr 23 14:13:43.592271 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.592209 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" event={"ID":"c88719ce-0c49-469d-b128-da335ca5a48e","Type":"ContainerDied","Data":"90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f"} Apr 23 14:13:43.592497 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.592269 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" event={"ID":"c88719ce-0c49-469d-b128-da335ca5a48e","Type":"ContainerDied","Data":"5be1d5335b65f79b9f94a43e7346f78eb80eb58387d70ef3ff5620391dcde90a"} Apr 23 14:13:43.592497 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.592292 2567 scope.go:117] "RemoveContainer" containerID="d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a" Apr 23 14:13:43.592497 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.592320 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr" Apr 23 14:13:43.600450 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.600431 2567 scope.go:117] "RemoveContainer" containerID="90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f" Apr 23 14:13:43.607260 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.607242 2567 scope.go:117] "RemoveContainer" containerID="d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a" Apr 23 14:13:43.607490 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:13:43.607474 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a\": container with ID starting with d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a not found: ID does not exist" containerID="d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a" Apr 23 14:13:43.607542 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.607498 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a"} err="failed to get container status \"d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a\": rpc error: code = NotFound desc = could not find container \"d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a\": container with ID starting with d31d9afc29005cee8e82a02dcbd531a4455904c606cdc56f5bf43fc39de5122a not found: ID does not exist" Apr 23 14:13:43.607542 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.607515 2567 scope.go:117] "RemoveContainer" containerID="90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f" Apr 23 14:13:43.607713 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:13:43.607697 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f\": container with ID starting with 90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f not found: ID does not exist" containerID="90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f" Apr 23 14:13:43.607756 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.607716 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f"} err="failed to get container status \"90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f\": rpc error: code = NotFound desc = could not find container \"90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f\": container with ID starting with 90bc115a948f0fee8fa47494d6b9e47cd144e18cba55f27bbcf2d72220b5fb5f not found: ID does not exist" Apr 23 14:13:43.612137 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.612111 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr"] Apr 23 14:13:43.613809 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:43.613789 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a8c3e-predictor-7f68d969df-qqgdr"] Apr 23 14:13:45.209870 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:13:45.209829 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" 
path="/var/lib/kubelet/pods/c88719ce-0c49-469d-b128-da335ca5a48e/volumes" Apr 23 14:17:15.510610 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:17:15.510502 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 14:17:15.518861 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:17:15.518841 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log" Apr 23 14:21:09.947741 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:09.947706 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt"] Apr 23 14:21:09.948314 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:09.948068 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kserve-container" containerID="cri-o://8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b" gracePeriod=30 Apr 23 14:21:09.948314 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:09.948122 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kube-rbac-proxy" containerID="cri-o://aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b" gracePeriod=30 Apr 23 14:21:10.703547 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.703501 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m95lt/must-gather-l82tj"] Apr 23 14:21:10.703891 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.703878 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kserve-container" Apr 23 14:21:10.703940 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.703893 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kserve-container" Apr 23 14:21:10.703940 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.703907 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kserve-container" Apr 23 14:21:10.703940 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.703912 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kserve-container" Apr 23 14:21:10.703940 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.703920 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kube-rbac-proxy" Apr 23 14:21:10.703940 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.703926 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kube-rbac-proxy" Apr 23 14:21:10.703940 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.703941 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kube-rbac-proxy" Apr 23 14:21:10.704137 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.703946 2567 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kube-rbac-proxy" Apr 23 14:21:10.704137 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.703995 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kserve-container" Apr 23 14:21:10.704137 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.704004 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kserve-container" Apr 23 14:21:10.704137 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.704011 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c88719ce-0c49-469d-b128-da335ca5a48e" containerName="kube-rbac-proxy" Apr 23 14:21:10.704137 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.704017 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ed1d85a-d2dc-4b9d-b42f-de90afc04a20" containerName="kube-rbac-proxy" Apr 23 14:21:10.707051 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.707033 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m95lt/must-gather-l82tj" Apr 23 14:21:10.709815 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.709792 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-m95lt\"/\"openshift-service-ca.crt\"" Apr 23 14:21:10.711419 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.711397 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-m95lt\"/\"default-dockercfg-fh6wk\"" Apr 23 14:21:10.711545 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.711473 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-m95lt\"/\"kube-root-ca.crt\"" Apr 23 14:21:10.721922 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.721894 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m95lt/must-gather-l82tj"] Apr 23 14:21:10.777519 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.777473 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/baaf4c2d-86b5-470a-b44f-629c61966ce3-must-gather-output\") pod \"must-gather-l82tj\" (UID: \"baaf4c2d-86b5-470a-b44f-629c61966ce3\") " pod="openshift-must-gather-m95lt/must-gather-l82tj" Apr 23 14:21:10.777692 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.777596 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jzpc\" (UniqueName: \"kubernetes.io/projected/baaf4c2d-86b5-470a-b44f-629c61966ce3-kube-api-access-5jzpc\") pod \"must-gather-l82tj\" (UID: \"baaf4c2d-86b5-470a-b44f-629c61966ce3\") " pod="openshift-must-gather-m95lt/must-gather-l82tj" Apr 23 14:21:10.878440 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.878396 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jzpc\" (UniqueName: \"kubernetes.io/projected/baaf4c2d-86b5-470a-b44f-629c61966ce3-kube-api-access-5jzpc\") pod \"must-gather-l82tj\" (UID: \"baaf4c2d-86b5-470a-b44f-629c61966ce3\") " pod="openshift-must-gather-m95lt/must-gather-l82tj" Apr 23 14:21:10.878612 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.878459 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/baaf4c2d-86b5-470a-b44f-629c61966ce3-must-gather-output\") pod \"must-gather-l82tj\" (UID: \"baaf4c2d-86b5-470a-b44f-629c61966ce3\") " pod="openshift-must-gather-m95lt/must-gather-l82tj" Apr 23 14:21:10.878748 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.878733 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/baaf4c2d-86b5-470a-b44f-629c61966ce3-must-gather-output\") pod \"must-gather-l82tj\" (UID: \"baaf4c2d-86b5-470a-b44f-629c61966ce3\") " pod="openshift-must-gather-m95lt/must-gather-l82tj" Apr 23 14:21:10.887520 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.887495 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jzpc\" (UniqueName: \"kubernetes.io/projected/baaf4c2d-86b5-470a-b44f-629c61966ce3-kube-api-access-5jzpc\") pod \"must-gather-l82tj\" (UID: \"baaf4c2d-86b5-470a-b44f-629c61966ce3\") " pod="openshift-must-gather-m95lt/must-gather-l82tj" Apr 23 14:21:10.904271 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.904239 2567 generic.go:358] "Generic (PLEG): container finished" podID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerID="aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b" exitCode=2 Apr 23 14:21:10.904401 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:10.904265 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" event={"ID":"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4","Type":"ContainerDied","Data":"aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b"} Apr 23 14:21:11.035653 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:11.035564 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m95lt/must-gather-l82tj" Apr 23 14:21:11.158419 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:11.158222 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m95lt/must-gather-l82tj"] Apr 23 14:21:11.161166 ip-10-0-141-176 kubenswrapper[2567]: W0423 14:21:11.161135 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaaf4c2d_86b5_470a_b44f_629c61966ce3.slice/crio-25223f13f0e706148c503420736a507e85bc86071749d65b00cb673f2862fbcf WatchSource:0}: Error finding container 25223f13f0e706148c503420736a507e85bc86071749d65b00cb673f2862fbcf: Status 404 returned error can't find the container with id 25223f13f0e706148c503420736a507e85bc86071749d65b00cb673f2862fbcf Apr 23 14:21:11.162892 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:11.162873 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:21:11.907751 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:11.907715 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m95lt/must-gather-l82tj" event={"ID":"baaf4c2d-86b5-470a-b44f-629c61966ce3","Type":"ContainerStarted","Data":"25223f13f0e706148c503420736a507e85bc86071749d65b00cb673f2862fbcf"} Apr 23 14:21:13.419008 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.418981 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:21:13.500936 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.500850 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-proxy-tls\") pod \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\" (UID: \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\") " Apr 23 14:21:13.501078 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.500948 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-feffa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-success-200-isvc-feffa-kube-rbac-proxy-sar-config\") pod \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\" (UID: \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\") " Apr 23 14:21:13.501078 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.501007 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g48tk\" (UniqueName: \"kubernetes.io/projected/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-kube-api-access-g48tk\") pod \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\" (UID: \"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4\") " Apr 23 14:21:13.501400 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.501363 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-success-200-isvc-feffa-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-feffa-kube-rbac-proxy-sar-config") pod "6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" (UID: "6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4"). InnerVolumeSpecName "success-200-isvc-feffa-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 14:21:13.503346 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.503317 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" (UID: "6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:21:13.503531 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.503509 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-kube-api-access-g48tk" (OuterVolumeSpecName: "kube-api-access-g48tk") pod "6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" (UID: "6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4"). InnerVolumeSpecName "kube-api-access-g48tk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:21:13.602110 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.602071 2567 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-feffa-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-success-200-isvc-feffa-kube-rbac-proxy-sar-config\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:21:13.602110 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.602107 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g48tk\" (UniqueName: \"kubernetes.io/projected/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-kube-api-access-g48tk\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:21:13.602110 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.602118 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4-proxy-tls\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:21:13.916022 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.915982 2567 generic.go:358] "Generic (PLEG): container finished" podID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerID="8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b" exitCode=0 Apr 23 14:21:13.916022 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.916027 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" event={"ID":"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4","Type":"ContainerDied","Data":"8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b"} Apr 23 14:21:13.916285 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.916060 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" event={"ID":"6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4","Type":"ContainerDied","Data":"9375fa7f7577a6363719c8a1cf353e699d95eac28335d27fde257684435a5733"} Apr 23 14:21:13.916285 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.916084 2567 scope.go:117] "RemoveContainer" containerID="aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b" Apr 23 14:21:13.916285 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.916102 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt" Apr 23 14:21:13.941904 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.941839 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt"] Apr 23 14:21:13.943458 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:13.943428 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-feffa-predictor-5cff45889-498mt"] Apr 23 14:21:15.210803 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:15.210770 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" path="/var/lib/kubelet/pods/6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4/volumes" Apr 23 14:21:15.584687 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:15.584442 2567 scope.go:117] "RemoveContainer" containerID="8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b" Apr 23 14:21:15.805128 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:15.805097 2567 scope.go:117] "RemoveContainer" containerID="aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b" Apr 23 14:21:15.805479 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:21:15.805456 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b\": container with ID starting with aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b not found: ID does not exist" containerID="aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b" Apr 23 14:21:15.805555 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:15.805490 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b"} err="failed to get container status \"aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b\": rpc error: code = NotFound desc = could not find container \"aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b\": container with ID starting with aae341772717abfc890194aca85508485ff79f56097fd95553ec0b4553aed35b not found: ID does not exist" Apr 23 14:21:15.805555 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:15.805512 2567 scope.go:117] "RemoveContainer" containerID="8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b" Apr 23 14:21:15.805827 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:21:15.805809 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b\": container with ID starting with 8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b not found: ID does not exist" containerID="8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b" Apr 23 14:21:15.805881 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:15.805832 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b"} err="failed to get container status \"8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b\": rpc error: code = NotFound desc = could not find container \"8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b\": container with ID starting with 8a07eb1e8e183ffb507588aef53be9a3756290f24289e9f3c14237e2cbe1218b not 
found: ID does not exist" Apr 23 14:21:16.930334 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:16.930295 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m95lt/must-gather-l82tj" event={"ID":"baaf4c2d-86b5-470a-b44f-629c61966ce3","Type":"ContainerStarted","Data":"189612768d941147f0e6b8e412cfed5763af83b06addf00b7d7408dbdc5eb529"} Apr 23 14:21:16.930334 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:16.930332 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m95lt/must-gather-l82tj" event={"ID":"baaf4c2d-86b5-470a-b44f-629c61966ce3","Type":"ContainerStarted","Data":"5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9"} Apr 23 14:21:16.947994 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:16.947939 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m95lt/must-gather-l82tj" podStartSLOduration=2.245321805 podStartE2EDuration="6.947921148s" podCreationTimestamp="2026-04-23 14:21:10 +0000 UTC" firstStartedPulling="2026-04-23 14:21:11.162994556 +0000 UTC m=+2938.545523663" lastFinishedPulling="2026-04-23 14:21:15.865593899 +0000 UTC m=+2943.248123006" observedRunningTime="2026-04-23 14:21:16.946724418 +0000 UTC m=+2944.329253550" watchObservedRunningTime="2026-04-23 14:21:16.947921148 +0000 UTC m=+2944.330450279" Apr 23 14:21:34.991819 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:34.991782 2567 generic.go:358] "Generic (PLEG): container finished" podID="baaf4c2d-86b5-470a-b44f-629c61966ce3" containerID="5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9" exitCode=0 Apr 23 14:21:34.992241 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:34.991844 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m95lt/must-gather-l82tj" event={"ID":"baaf4c2d-86b5-470a-b44f-629c61966ce3","Type":"ContainerDied","Data":"5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9"} Apr 23 14:21:34.992241 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:34.992155 2567 scope.go:117] "RemoveContainer" containerID="5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9" Apr 23 14:21:35.786844 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:35.786811 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m95lt_must-gather-l82tj_baaf4c2d-86b5-470a-b44f-629c61966ce3/gather/0.log" Apr 23 14:21:36.483987 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.483952 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cvccm/must-gather-hxgnf"] Apr 23 14:21:36.484402 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.484286 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kube-rbac-proxy" Apr 23 14:21:36.484402 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.484297 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kube-rbac-proxy" Apr 23 14:21:36.484402 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.484305 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kserve-container" Apr 23 14:21:36.484402 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.484311 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kserve-container" Apr 23 14:21:36.484402 ip-10-0-141-176 
kubenswrapper[2567]: I0423 14:21:36.484372 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kube-rbac-proxy" Apr 23 14:21:36.484402 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.484380 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e0ec7bd-e53b-4dbf-923e-d7303fdf8db4" containerName="kserve-container" Apr 23 14:21:36.487647 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.487626 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cvccm/must-gather-hxgnf" Apr 23 14:21:36.490895 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.490873 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cvccm\"/\"kube-root-ca.crt\"" Apr 23 14:21:36.490895 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.490880 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cvccm\"/\"openshift-service-ca.crt\"" Apr 23 14:21:36.492303 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.492288 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-cvccm\"/\"default-dockercfg-2cmhh\"" Apr 23 14:21:36.496077 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.496057 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cvccm/must-gather-hxgnf"] Apr 23 14:21:36.598393 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.598361 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sd6b\" (UniqueName: \"kubernetes.io/projected/f64c04dd-8e28-44ab-b3d3-7995171b3fdd-kube-api-access-8sd6b\") pod \"must-gather-hxgnf\" (UID: \"f64c04dd-8e28-44ab-b3d3-7995171b3fdd\") " pod="openshift-must-gather-cvccm/must-gather-hxgnf" Apr 23 14:21:36.598566 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.598431 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f64c04dd-8e28-44ab-b3d3-7995171b3fdd-must-gather-output\") pod \"must-gather-hxgnf\" (UID: \"f64c04dd-8e28-44ab-b3d3-7995171b3fdd\") " pod="openshift-must-gather-cvccm/must-gather-hxgnf" Apr 23 14:21:36.699488 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.699448 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8sd6b\" (UniqueName: \"kubernetes.io/projected/f64c04dd-8e28-44ab-b3d3-7995171b3fdd-kube-api-access-8sd6b\") pod \"must-gather-hxgnf\" (UID: \"f64c04dd-8e28-44ab-b3d3-7995171b3fdd\") " pod="openshift-must-gather-cvccm/must-gather-hxgnf" Apr 23 14:21:36.699680 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.699522 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f64c04dd-8e28-44ab-b3d3-7995171b3fdd-must-gather-output\") pod \"must-gather-hxgnf\" (UID: \"f64c04dd-8e28-44ab-b3d3-7995171b3fdd\") " pod="openshift-must-gather-cvccm/must-gather-hxgnf" Apr 23 14:21:36.699837 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.699821 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f64c04dd-8e28-44ab-b3d3-7995171b3fdd-must-gather-output\") pod \"must-gather-hxgnf\" (UID: \"f64c04dd-8e28-44ab-b3d3-7995171b3fdd\") " 
pod="openshift-must-gather-cvccm/must-gather-hxgnf" Apr 23 14:21:36.707933 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.707906 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sd6b\" (UniqueName: \"kubernetes.io/projected/f64c04dd-8e28-44ab-b3d3-7995171b3fdd-kube-api-access-8sd6b\") pod \"must-gather-hxgnf\" (UID: \"f64c04dd-8e28-44ab-b3d3-7995171b3fdd\") " pod="openshift-must-gather-cvccm/must-gather-hxgnf" Apr 23 14:21:36.798132 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.798052 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cvccm/must-gather-hxgnf" Apr 23 14:21:36.915994 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.915961 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cvccm/must-gather-hxgnf"] Apr 23 14:21:36.918978 ip-10-0-141-176 kubenswrapper[2567]: W0423 14:21:36.918949 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf64c04dd_8e28_44ab_b3d3_7995171b3fdd.slice/crio-edb84423a1f916a3dba7aa39d964fa1b88ebf91910d387fe9fe4facd30a26eb5 WatchSource:0}: Error finding container edb84423a1f916a3dba7aa39d964fa1b88ebf91910d387fe9fe4facd30a26eb5: Status 404 returned error can't find the container with id edb84423a1f916a3dba7aa39d964fa1b88ebf91910d387fe9fe4facd30a26eb5 Apr 23 14:21:36.998522 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:36.998481 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cvccm/must-gather-hxgnf" event={"ID":"f64c04dd-8e28-44ab-b3d3-7995171b3fdd","Type":"ContainerStarted","Data":"edb84423a1f916a3dba7aa39d964fa1b88ebf91910d387fe9fe4facd30a26eb5"} Apr 23 14:21:38.006729 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:38.006662 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cvccm/must-gather-hxgnf" event={"ID":"f64c04dd-8e28-44ab-b3d3-7995171b3fdd","Type":"ContainerStarted","Data":"6aff4a02a34fb6eea684c088bad3317d408378523665e03aa92b9fbcd5dbd7f8"} Apr 23 14:21:39.012213 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:39.012178 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cvccm/must-gather-hxgnf" event={"ID":"f64c04dd-8e28-44ab-b3d3-7995171b3fdd","Type":"ContainerStarted","Data":"6c23e8559768975b77c7c1f5239840e855c79115b0b254884a981ac6cff00700"} Apr 23 14:21:39.035222 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:39.035157 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cvccm/must-gather-hxgnf" podStartSLOduration=2.1800820180000002 podStartE2EDuration="3.035136065s" podCreationTimestamp="2026-04-23 14:21:36 +0000 UTC" firstStartedPulling="2026-04-23 14:21:36.920649113 +0000 UTC m=+2964.303178222" lastFinishedPulling="2026-04-23 14:21:37.775703161 +0000 UTC m=+2965.158232269" observedRunningTime="2026-04-23 14:21:39.033285254 +0000 UTC m=+2966.415814388" watchObservedRunningTime="2026-04-23 14:21:39.035136065 +0000 UTC m=+2966.417665176" Apr 23 14:21:39.285784 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:39.285712 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-p54pj_33d32343-1781-48b5-bdcd-a04d2dec36da/global-pull-secret-syncer/0.log" Apr 23 14:21:39.451358 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:39.451330 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-58dnr_226f8750-5ae4-4644-ac04-451c03fc015b/konnectivity-agent/0.log" Apr 23 14:21:39.622421 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:39.622391 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-176.ec2.internal_829c0bd398638defff6c06e99f548781/haproxy/0.log" Apr 23 14:21:41.317674 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.317632 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m95lt/must-gather-l82tj"] Apr 23 14:21:41.318529 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.318490 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-m95lt/must-gather-l82tj" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" containerName="copy" containerID="cri-o://189612768d941147f0e6b8e412cfed5763af83b06addf00b7d7408dbdc5eb529" gracePeriod=2 Apr 23 14:21:41.321597 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.321554 2567 status_manager.go:895] "Failed to get status for pod" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" pod="openshift-must-gather-m95lt/must-gather-l82tj" err="pods \"must-gather-l82tj\" is forbidden: User \"system:node:ip-10-0-141-176.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m95lt\": no relationship found between node 'ip-10-0-141-176.ec2.internal' and this object" Apr 23 14:21:41.324039 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.324015 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m95lt/must-gather-l82tj"] Apr 23 14:21:41.694250 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.693974 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m95lt_must-gather-l82tj_baaf4c2d-86b5-470a-b44f-629c61966ce3/copy/0.log" Apr 23 14:21:41.696350 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.694428 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m95lt/must-gather-l82tj" Apr 23 14:21:41.699712 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.699480 2567 status_manager.go:895] "Failed to get status for pod" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" pod="openshift-must-gather-m95lt/must-gather-l82tj" err="pods \"must-gather-l82tj\" is forbidden: User \"system:node:ip-10-0-141-176.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m95lt\": no relationship found between node 'ip-10-0-141-176.ec2.internal' and this object" Apr 23 14:21:41.756209 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.754646 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/baaf4c2d-86b5-470a-b44f-629c61966ce3-must-gather-output\") pod \"baaf4c2d-86b5-470a-b44f-629c61966ce3\" (UID: \"baaf4c2d-86b5-470a-b44f-629c61966ce3\") " Apr 23 14:21:41.756445 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.756335 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jzpc\" (UniqueName: \"kubernetes.io/projected/baaf4c2d-86b5-470a-b44f-629c61966ce3-kube-api-access-5jzpc\") pod \"baaf4c2d-86b5-470a-b44f-629c61966ce3\" (UID: \"baaf4c2d-86b5-470a-b44f-629c61966ce3\") " Apr 23 14:21:41.756797 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.756157 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baaf4c2d-86b5-470a-b44f-629c61966ce3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "baaf4c2d-86b5-470a-b44f-629c61966ce3" (UID: "baaf4c2d-86b5-470a-b44f-629c61966ce3"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:21:41.760244 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.759476 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baaf4c2d-86b5-470a-b44f-629c61966ce3-kube-api-access-5jzpc" (OuterVolumeSpecName: "kube-api-access-5jzpc") pod "baaf4c2d-86b5-470a-b44f-629c61966ce3" (UID: "baaf4c2d-86b5-470a-b44f-629c61966ce3"). InnerVolumeSpecName "kube-api-access-5jzpc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:21:41.857771 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.857730 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5jzpc\" (UniqueName: \"kubernetes.io/projected/baaf4c2d-86b5-470a-b44f-629c61966ce3-kube-api-access-5jzpc\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:21:41.857771 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:41.857773 2567 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/baaf4c2d-86b5-470a-b44f-629c61966ce3-must-gather-output\") on node \"ip-10-0-141-176.ec2.internal\" DevicePath \"\"" Apr 23 14:21:42.028437 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:42.028340 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m95lt_must-gather-l82tj_baaf4c2d-86b5-470a-b44f-629c61966ce3/copy/0.log" Apr 23 14:21:42.028739 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:42.028711 2567 generic.go:358] "Generic (PLEG): container finished" podID="baaf4c2d-86b5-470a-b44f-629c61966ce3" containerID="189612768d941147f0e6b8e412cfed5763af83b06addf00b7d7408dbdc5eb529" exitCode=143 Apr 23 14:21:42.028802 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:42.028779 2567 scope.go:117] "RemoveContainer" containerID="189612768d941147f0e6b8e412cfed5763af83b06addf00b7d7408dbdc5eb529" Apr 23 14:21:42.028921 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:42.028905 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m95lt/must-gather-l82tj" Apr 23 14:21:42.035398 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:42.035359 2567 status_manager.go:895] "Failed to get status for pod" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" pod="openshift-must-gather-m95lt/must-gather-l82tj" err="pods \"must-gather-l82tj\" is forbidden: User \"system:node:ip-10-0-141-176.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m95lt\": no relationship found between node 'ip-10-0-141-176.ec2.internal' and this object" Apr 23 14:21:42.047068 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:42.047042 2567 scope.go:117] "RemoveContainer" containerID="5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9" Apr 23 14:21:42.061292 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:42.056162 2567 status_manager.go:895] "Failed to get status for pod" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" pod="openshift-must-gather-m95lt/must-gather-l82tj" err="pods \"must-gather-l82tj\" is forbidden: User \"system:node:ip-10-0-141-176.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m95lt\": no relationship found between node 'ip-10-0-141-176.ec2.internal' and this object" Apr 23 14:21:42.070020 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:42.069716 2567 scope.go:117] "RemoveContainer" containerID="189612768d941147f0e6b8e412cfed5763af83b06addf00b7d7408dbdc5eb529" Apr 23 14:21:42.070537 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:21:42.070247 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189612768d941147f0e6b8e412cfed5763af83b06addf00b7d7408dbdc5eb529\": container with ID starting with 189612768d941147f0e6b8e412cfed5763af83b06addf00b7d7408dbdc5eb529 not found: ID does not exist" containerID="189612768d941147f0e6b8e412cfed5763af83b06addf00b7d7408dbdc5eb529" Apr 23 
14:21:42.070537 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:42.070286 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189612768d941147f0e6b8e412cfed5763af83b06addf00b7d7408dbdc5eb529"} err="failed to get container status \"189612768d941147f0e6b8e412cfed5763af83b06addf00b7d7408dbdc5eb529\": rpc error: code = NotFound desc = could not find container \"189612768d941147f0e6b8e412cfed5763af83b06addf00b7d7408dbdc5eb529\": container with ID starting with 189612768d941147f0e6b8e412cfed5763af83b06addf00b7d7408dbdc5eb529 not found: ID does not exist" Apr 23 14:21:42.070537 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:42.070313 2567 scope.go:117] "RemoveContainer" containerID="5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9" Apr 23 14:21:42.070905 ip-10-0-141-176 kubenswrapper[2567]: E0423 14:21:42.070812 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9\": container with ID starting with 5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9 not found: ID does not exist" containerID="5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9" Apr 23 14:21:42.070905 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:42.070851 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9"} err="failed to get container status \"5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9\": rpc error: code = NotFound desc = could not find container \"5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9\": container with ID starting with 5642c15586f90b29e65d84740916cdd46e5cb03268de49a85ba56ecb640b65e9 not found: ID does not exist" Apr 23 14:21:43.210846 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.210796 2567 status_manager.go:895] "Failed to get status for pod" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" pod="openshift-must-gather-m95lt/must-gather-l82tj" err="pods \"must-gather-l82tj\" is forbidden: User \"system:node:ip-10-0-141-176.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m95lt\": no relationship found between node 'ip-10-0-141-176.ec2.internal' and this object" Apr 23 14:21:43.211796 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.211748 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" path="/var/lib/kubelet/pods/baaf4c2d-86b5-470a-b44f-629c61966ce3/volumes" Apr 23 14:21:43.255474 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.255438 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bqhpf_def66f8e-82f1-4a4a-9ee8-e407c64ef503/kube-state-metrics/0.log" Apr 23 14:21:43.284190 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.284146 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bqhpf_def66f8e-82f1-4a4a-9ee8-e407c64ef503/kube-rbac-proxy-main/0.log" Apr 23 14:21:43.314600 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.314557 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bqhpf_def66f8e-82f1-4a4a-9ee8-e407c64ef503/kube-rbac-proxy-self/0.log" Apr 23 14:21:43.412956 ip-10-0-141-176 kubenswrapper[2567]: I0423 
14:21:43.412910 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-x877v_a57cae69-bdd0-49ee-b9e0-c6b7a5cb4ff1/monitoring-plugin/0.log" Apr 23 14:21:43.608870 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.608838 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tf56p_a919ee68-d491-466e-afde-2aabacadceda/node-exporter/0.log" Apr 23 14:21:43.629264 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.629219 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tf56p_a919ee68-d491-466e-afde-2aabacadceda/kube-rbac-proxy/0.log" Apr 23 14:21:43.652239 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.652191 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tf56p_a919ee68-d491-466e-afde-2aabacadceda/init-textfile/0.log" Apr 23 14:21:43.778702 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.778667 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ea6ce526-eeec-4478-8ddf-ff76d231efb6/prometheus/0.log" Apr 23 14:21:43.797886 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.797849 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ea6ce526-eeec-4478-8ddf-ff76d231efb6/config-reloader/0.log" Apr 23 14:21:43.823935 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.823901 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ea6ce526-eeec-4478-8ddf-ff76d231efb6/thanos-sidecar/0.log" Apr 23 14:21:43.847534 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.847508 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ea6ce526-eeec-4478-8ddf-ff76d231efb6/kube-rbac-proxy-web/0.log" Apr 23 14:21:43.871866 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.871791 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ea6ce526-eeec-4478-8ddf-ff76d231efb6/kube-rbac-proxy/0.log" Apr 23 14:21:43.898968 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.898934 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ea6ce526-eeec-4478-8ddf-ff76d231efb6/kube-rbac-proxy-thanos/0.log" Apr 23 14:21:43.924538 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.924500 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ea6ce526-eeec-4478-8ddf-ff76d231efb6/init-config-reloader/0.log" Apr 23 14:21:43.954622 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.954588 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ml98b_fdbde0a6-7fe0-4fbb-8a91-766140234fc7/prometheus-operator/0.log" Apr 23 14:21:43.976196 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:43.976162 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ml98b_fdbde0a6-7fe0-4fbb-8a91-766140234fc7/kube-rbac-proxy/0.log" Apr 23 14:21:44.002653 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:44.002617 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-bt29q_30a8fc37-9e56-46cd-92ab-f911c49adc81/prometheus-operator-admission-webhook/0.log" Apr 23 14:21:46.555874 ip-10-0-141-176 kubenswrapper[2567]: I0423 
Apr 23 14:21:46.555874 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.555836 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"]
Apr 23 14:21:46.556894 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.556865 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" containerName="copy"
Apr 23 14:21:46.557065 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.557050 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" containerName="copy"
Apr 23 14:21:46.557155 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.557145 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" containerName="gather"
Apr 23 14:21:46.557260 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.557248 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" containerName="gather"
Apr 23 14:21:46.557475 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.557464 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" containerName="gather"
Apr 23 14:21:46.557566 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.557556 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="baaf4c2d-86b5-470a-b44f-629c61966ce3" containerName="copy"
Apr 23 14:21:46.560945 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.560920 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.566954 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.566923 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"]
Apr 23 14:21:46.673256 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.673206 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-kwjr8_f6442128-0a86-4621-9b97-053c66c0c77c/volume-data-source-validator/0.log"
Apr 23 14:21:46.705148 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.704938 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-podres\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.705148 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.705030 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-proc\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.705148 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.705065 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-sys\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.705148 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.705090 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp98k\" (UniqueName: \"kubernetes.io/projected/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-kube-api-access-tp98k\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.705148 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.705128 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-lib-modules\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.806028 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.805934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-proc\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.806028 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.805972 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-sys\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.806028 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.805989 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tp98k\" (UniqueName: \"kubernetes.io/projected/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-kube-api-access-tp98k\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.806325 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.806051 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-proc\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.806325 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.806077 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-sys\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.806325 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.806104 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-lib-modules\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.806325 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.806168 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-podres\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.806325 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.806294 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-lib-modules\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.806500 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.806332 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-podres\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.814689 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.814663 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp98k\" (UniqueName: \"kubernetes.io/projected/8aac9a44-eeba-4964-9c2b-a160c9fa3fd5-kube-api-access-tp98k\") pod \"perf-node-gather-daemonset-s7slc\" (UID: \"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:46.872896 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:46.872862 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:47.027423 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:47.027395 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"]
Apr 23 14:21:47.030935 ip-10-0-141-176 kubenswrapper[2567]: W0423 14:21:47.030904 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8aac9a44_eeba_4964_9c2b_a160c9fa3fd5.slice/crio-7bcefc3e3dd810702f9cbda6e9957a2ea28e6471c706e70f78256f804076b663 WatchSource:0}: Error finding container 7bcefc3e3dd810702f9cbda6e9957a2ea28e6471c706e70f78256f804076b663: Status 404 returned error can't find the container with id 7bcefc3e3dd810702f9cbda6e9957a2ea28e6471c706e70f78256f804076b663
Apr 23 14:21:47.051856 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:47.051822 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc" event={"ID":"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5","Type":"ContainerStarted","Data":"7bcefc3e3dd810702f9cbda6e9957a2ea28e6471c706e70f78256f804076b663"}
Apr 23 14:21:47.390401 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:47.390374 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2hdsq_7b86f405-4871-42b3-aa86-bde954086fa9/dns/0.log"
Apr 23 14:21:47.412812 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:47.412785 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2hdsq_7b86f405-4871-42b3-aa86-bde954086fa9/kube-rbac-proxy/0.log"
Apr 23 14:21:47.528490 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:47.528464 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4nhld_f6811f33-1a89-4437-950c-bdb29fcbc2f5/dns-node-resolver/0.log"
Apr 23 14:21:48.040460 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:48.040420 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5lgwg_a642a633-45bc-405e-899d-b28d88699e93/node-ca/0.log"
Apr 23 14:21:48.055749 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:48.055721 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc" event={"ID":"8aac9a44-eeba-4964-9c2b-a160c9fa3fd5","Type":"ContainerStarted","Data":"fc2442c22ff30628039d885beddcbe005b91f8e9e338a20c471eb1945d7ee12d"}
Apr 23 14:21:48.055903 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:48.055773 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:48.073620 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:48.073576 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc" podStartSLOduration=2.07356296 podStartE2EDuration="2.07356296s" podCreationTimestamp="2026-04-23 14:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:21:48.072129956 +0000 UTC m=+2975.454659079" watchObservedRunningTime="2026-04-23 14:21:48.07356296 +0000 UTC m=+2975.456092089"
Apr 23 14:21:49.276205 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:49.276173 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rhpd9_441292d6-55dd-4164-b5f7-2bdf45288757/serve-healthcheck-canary/0.log"
Apr 23 14:21:49.819098 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:49.819070 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zgjs5_dfe0a743-56d3-48de-a1bd-7f4b26e628aa/kube-rbac-proxy/0.log"
Apr 23 14:21:49.841064 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:49.841036 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zgjs5_dfe0a743-56d3-48de-a1bd-7f4b26e628aa/exporter/0.log"
Apr 23 14:21:49.867218 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:49.867160 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zgjs5_dfe0a743-56d3-48de-a1bd-7f4b26e628aa/extractor/0.log"
Apr 23 14:21:52.151483 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:52.151453 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-6fwwt_5d7e9f42-0883-4a48-bbb7-c61087f3818d/manager/0.log"
Apr 23 14:21:52.171741 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:52.171709 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-6tb7v_11ac586b-5bb2-482e-b559-f2f6186d7d8e/s3-init/0.log"
Apr 23 14:21:54.069992 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:54.069965 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-s7slc"
Apr 23 14:21:56.199221 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:56.199142 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-8dscm_0fc36607-6fd1-4cc6-b3e3-61494fda3497/migrator/0.log"
Apr 23 14:21:56.219567 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:56.219540 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-8dscm_0fc36607-6fd1-4cc6-b3e3-61494fda3497/graceful-termination/0.log"
Apr 23 14:21:57.494758 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:57.494730 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-26s76_92424b25-21d9-42cb-aca4-cad86a5e3dad/kube-multus/0.log"
Apr 23 14:21:57.861773 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:57.861747 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2589_cc296f8c-211e-4e05-8959-a5aba129cc83/kube-multus-additional-cni-plugins/0.log"
Apr 23 14:21:57.884270 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:57.884200 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2589_cc296f8c-211e-4e05-8959-a5aba129cc83/egress-router-binary-copy/0.log"
Apr 23 14:21:57.903458 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:57.903434 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2589_cc296f8c-211e-4e05-8959-a5aba129cc83/cni-plugins/0.log"
Apr 23 14:21:57.924173 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:57.924146 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2589_cc296f8c-211e-4e05-8959-a5aba129cc83/bond-cni-plugin/0.log"
Apr 23 14:21:57.950915 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:57.950888 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2589_cc296f8c-211e-4e05-8959-a5aba129cc83/routeoverride-cni/0.log"
Apr 23 14:21:57.976045 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:57.976021 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2589_cc296f8c-211e-4e05-8959-a5aba129cc83/whereabouts-cni-bincopy/0.log"
Apr 23 14:21:57.996796 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:57.996774 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w2589_cc296f8c-211e-4e05-8959-a5aba129cc83/whereabouts-cni/0.log"
Apr 23 14:21:58.090595 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:58.090565 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c246c_69a4531d-2959-43b5-929f-9d7ddf10163b/network-metrics-daemon/0.log"
Apr 23 14:21:58.108928 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:58.108897 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c246c_69a4531d-2959-43b5-929f-9d7ddf10163b/kube-rbac-proxy/0.log"
Apr 23 14:21:58.869990 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:58.869950 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-controller/0.log"
Apr 23 14:21:58.886577 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:58.886552 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/0.log"
Apr 23 14:21:58.901643 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:58.901617 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovn-acl-logging/1.log"
Apr 23 14:21:58.919870 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:58.919845 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/kube-rbac-proxy-node/0.log"
Apr 23 14:21:58.942293 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:58.942261 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 14:21:58.957040 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:58.957016 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/northd/0.log"
Apr 23 14:21:58.975750 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:58.975728 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/nbdb/0.log"
Apr 23 14:21:58.995733 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:58.995708 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/sbdb/0.log"
Apr 23 14:21:59.129480 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:21:59.129390 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kqqbw_b5ea6fad-b66f-4dc4-b956-3f7e7185d225/ovnkube-controller/0.log"
Apr 23 14:22:00.807779 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:22:00.807752 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-75h8j_a43bd7f9-1505-4e58-acda-ef8e398e302d/network-check-target-container/0.log"
Apr 23 14:22:01.763941 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:22:01.763913 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-hkkq2_d945d3c6-4041-4013-9864-0b2c325ebccb/iptables-alerter/0.log"
Apr 23 14:22:02.450852 ip-10-0-141-176 kubenswrapper[2567]: I0423 14:22:02.450818 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-x9nzd_05de8049-2bce-4d70-bdf3-a72ee2c57e37/tuned/0.log"