Apr 24 21:26:14.830268 ip-10-0-134-249 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:26:14.830281 ip-10-0-134-249 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:26:14.830291 ip-10-0-134-249 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:26:14.830598 ip-10-0-134-249 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:26:24.993340 ip-10-0-134-249 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:26:24.993359 ip-10-0-134-249 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 3e7a8cb4b6644bec939a90db2ec72d4e --
Apr 24 21:28:35.190805 ip-10-0-134-249 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:28:35.761688 ip-10-0-134-249 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:28:35.761688 ip-10-0-134-249 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:28:35.761688 ip-10-0-134-249 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:28:35.761688 ip-10-0-134-249 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:28:35.761688 ip-10-0-134-249 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:28:35.765137 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.765062 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 24 21:28:35.767871 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767857 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:28:35.767871 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767871 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767876 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767879 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767882 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767885 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767888 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767891 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767894 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767896 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767899 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767907 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767911 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767913 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767916 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767918 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767921 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767924 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767926 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767929 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767932 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 
24 21:28:35.767938 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767935 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767938 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767942 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767946 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767949 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767953 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767956 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767959 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767962 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767965 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767968 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767971 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767974 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767977 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767980 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767983 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767985 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767988 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767990 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767993 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:28:35.768427 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767995 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.767998 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: 
W0424 21:28:35.768001 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768003 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768006 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768008 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768011 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768013 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768016 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768018 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768021 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768023 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768025 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768028 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768032 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768034 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768037 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768039 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768042 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768044 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:28:35.768915 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768047 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768049 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768051 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768054 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768056 2567 feature_gate.go:328] 
unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768059 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768061 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768064 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768066 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768069 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768071 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768076 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768079 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768083 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768086 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768089 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768091 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768094 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768096 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:28:35.769399 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768099 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768101 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768104 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768107 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768109 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768112 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768462 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768467 2567 feature_gate.go:328] unrecognized feature 
gate: ClusterAPIInstall Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768470 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768473 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768476 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768478 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768481 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768483 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768486 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768489 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768491 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768494 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768497 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:28:35.769859 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768499 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768502 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768504 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768507 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768509 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768512 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768514 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768517 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768533 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768538 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768542 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768545 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768547 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768550 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768552 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768555 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768557 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768560 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768562 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768565 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:28:35.770309 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768568 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768571 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768573 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768576 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768578 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768581 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768584 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768586 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768589 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768591 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768594 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768597 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768599 2567 feature_gate.go:328] 
unrecognized feature gate: VSphereMultiNetworks Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768602 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768604 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768607 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768610 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768612 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768615 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768618 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:28:35.770937 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768621 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768623 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768633 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768635 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768638 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768641 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768643 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768646 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768650 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768653 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768656 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768659 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768663 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768666 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768669 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768672 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768674 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768677 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768679 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:28:35.771665 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768682 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768684 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768687 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768689 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768692 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768694 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768697 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768699 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768703 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768705 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768708 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768711 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768714 
2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.768716 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770021 2567 flags.go:64] FLAG: --address="0.0.0.0" Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770030 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770038 2567 flags.go:64] FLAG: --anonymous-auth="true" Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770043 2567 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770047 2567 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770050 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770060 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 24 21:28:35.772125 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770065 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770068 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770071 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770074 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770078 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770081 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770084 2567 flags.go:64] FLAG: --cgroup-root="" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770087 2567 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770090 2567 flags.go:64] FLAG: --client-ca-file="" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770093 2567 flags.go:64] FLAG: --cloud-config="" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770096 2567 flags.go:64] FLAG: --cloud-provider="external" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770099 2567 flags.go:64] FLAG: --cluster-dns="[]" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770103 2567 flags.go:64] FLAG: --cluster-domain="" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770106 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770109 2567 flags.go:64] FLAG: --config-dir="" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770112 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770115 2567 flags.go:64] FLAG: 
--container-log-max-files="5" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770119 2567 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770122 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770125 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770128 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770131 2567 flags.go:64] FLAG: --contention-profiling="false" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770134 2567 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770138 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770141 2567 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 24 21:28:35.772659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770144 2567 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770148 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770151 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770154 2567 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770157 2567 flags.go:64] FLAG: --enable-load-reader="false" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770160 2567 flags.go:64] FLAG: --enable-server="true" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770164 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770168 2567 flags.go:64] FLAG: --event-burst="100" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770172 2567 flags.go:64] FLAG: --event-qps="50" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770175 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770178 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770180 2567 flags.go:64] FLAG: --eviction-hard="" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770184 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770187 2567 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770190 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770193 2567 flags.go:64] FLAG: --eviction-soft="" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770196 2567 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 
21:28:35.770199 2567 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770202 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770204 2567 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770207 2567 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770210 2567 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770213 2567 flags.go:64] FLAG: --feature-gates="" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770217 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770220 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:28:35.773250 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770223 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770226 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770233 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770236 2567 flags.go:64] FLAG: --help="false" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770239 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-134-249.ec2.internal" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770242 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770245 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770248 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770251 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770255 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770258 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770261 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770264 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770267 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770270 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770273 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770276 2567 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:28:35.773884 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:28:35.770279 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770282 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770285 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770288 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770291 2567 flags.go:64] FLAG: --lock-file="" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770294 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770311 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:28:35.773884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770315 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770321 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770324 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770327 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770331 2567 flags.go:64] FLAG: --logging-format="text" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770334 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770337 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770340 2567 flags.go:64] FLAG: --manifest-url="" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770343 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770347 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770351 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770355 2567 flags.go:64] FLAG: --max-pods="110" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770358 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770361 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770364 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770367 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770370 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770373 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770376 2567 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770383 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770386 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770389 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770393 2567 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:28:35.774460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770396 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770401 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770404 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770407 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770410 2567 flags.go:64] FLAG: --port="10250" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770413 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770416 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01f74cc27f4ed60ba" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770419 2567 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770422 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770424 2567 flags.go:64] FLAG: --register-node="true" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770427 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770430 2567 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770433 2567 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770436 2567 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770439 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770442 2567 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770445 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770448 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770451 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770454 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770457 2567 flags.go:64] FLAG: --runonce="false" Apr 24 21:28:35.775031 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:28:35.770460 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770463 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770466 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770469 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770472 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:28:35.775031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770474 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770477 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770480 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770483 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770486 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770489 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770492 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770496 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770499 2567 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770501 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770507 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770510 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770513 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770516 2567 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770532 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770535 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770538 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770541 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770544 2567 flags.go:64] FLAG: --v="2" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770556 2567 flags.go:64] FLAG: --version="false" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770560 2567 flags.go:64] FLAG: --vmodule="" Apr 24 21:28:35.775666 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:28:35.770565 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.770568 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.770759 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:28:35.775666 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.770765 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.770768 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.770771 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.770774 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.770777 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.770780 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.770784 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771733 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771871 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771883 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771887 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771890 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771893 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771896 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771898 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771901 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771903 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771907 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771909 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771912 2567 feature_gate.go:328] unrecognized 
feature gate: ManagedBootImagesAzure Apr 24 21:28:35.776225 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771915 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771917 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771920 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771923 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771926 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771929 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771931 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771934 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771936 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771940 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771943 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771951 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771954 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771957 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771959 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771963 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771966 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771968 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771971 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771973 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:28:35.776767 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771977 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771980 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:28:35.777267 ip-10-0-134-249 
kubenswrapper[2567]: W0424 21:28:35.771982 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771985 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771988 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771990 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771993 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771995 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.771998 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772001 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772004 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772006 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772009 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772011 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772014 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772016 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772019 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772022 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772024 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772027 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:28:35.777267 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772029 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772032 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772034 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772037 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772040 2567 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772043 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772045 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772048 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772050 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772053 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772055 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772058 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772064 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772066 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772069 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772072 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772074 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772077 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772082 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:28:35.777783 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772085 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772089 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
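The repeated "unrecognized feature gate" warnings above are not fatal: the gate names (GatewayAPI, ManagedBootImages, InsightsConfig, and the rest) appear to be OpenShift-level feature gates handed down to the kubelet, which only registers the upstream Kubernetes gates and therefore logs a warning for every key it does not recognize before moving on. The same parse runs more than once during startup, which is why identical names recur with later timestamps. To see the distinct names instead of the raw repetition, a minimal sketch like the following can be run against a saved journal; journal.txt is a hypothetical path, captured for example with journalctl -u kubelet.

# Sketch: count the distinct "unrecognized feature gate" names in a saved
# kubelet journal. journal.txt is a placeholder for wherever the log was saved.
import re
from collections import Counter

pattern = re.compile(r"unrecognized feature gate: (\S+)")
counts = Counter()

with open("journal.txt", encoding="utf-8") as f:
    for line in f:
        counts.update(pattern.findall(line))

for name, n in counts.most_common():
    print(f"{n:3d}  {name}")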
Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772094 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772097 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772099 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.772102 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.772108 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.778050 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.778066 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778110 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778115 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778118 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778121 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778124 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778127 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778130 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:28:35.778228 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778133 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778136 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778139 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778142 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778145 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778147 2567 feature_gate.go:328] unrecognized feature gate: 
VolumeGroupSnapshot Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778150 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778153 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778155 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778158 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778160 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778163 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778166 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778169 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778172 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778175 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778177 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778180 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778182 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778185 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:28:35.778646 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778188 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778190 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778193 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778196 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778199 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778202 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778205 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778207 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: 
W0424 21:28:35.778210 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778212 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778215 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778218 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778221 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778223 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778226 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778229 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778232 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778235 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778238 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:28:35.779123 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778240 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778243 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778246 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778249 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778251 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778255 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778259 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778263 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778266 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778269 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778271 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778274 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778276 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778279 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778282 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778284 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778287 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778291 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778293 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:28:35.779615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778296 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778300 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778303 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778306 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778309 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778312 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778315 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778317 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778320 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778323 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778325 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778328 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778331 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778333 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778336 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778338 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778341 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778343 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778346 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778348 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:28:35.780104 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778350 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.778356 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]} Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778447 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778451 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778454 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778457 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778460 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778463 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778466 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778469 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778472 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778475 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778477 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778480 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778482 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:28:35.780615 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778485 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778487 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778490 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778492 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778495 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778497 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778500 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778502 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778505 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:28:35.780991 
ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778507 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778509 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778512 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778514 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778517 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778533 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778536 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778539 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778541 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778544 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:28:35.780991 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778547 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778550 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778554 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778557 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778559 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778562 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778564 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778567 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778570 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778573 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778575 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778578 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778580 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778582 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778585 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778587 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778590 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778592 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778595 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778597 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:28:35.781456 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778600 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778603 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778607 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778610 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778613 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778616 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778619 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778621 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778624 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778627 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778629 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778632 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778634 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778637 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778640 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778643 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778645 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778648 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778651 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:28:35.782005 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778653 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778656 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778659 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778661 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778664 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778667 2567 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778669 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778672 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778674 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778677 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778679 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778682 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778684 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778687 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:35.778690 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.778695 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:28:35.782481 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.779405 2567 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 21:28:35.783362 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.783348 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 21:28:35.784144 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.784133 2567 server.go:1019] "Starting client certificate rotation" Apr 24 21:28:35.784256 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.784240 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:28:35.784290 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.784283 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:28:35.808021 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.808005 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:28:35.814908 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.814891 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:28:35.828455 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.828439 2567 log.go:25] "Validated CRI v1 runtime API" Apr 24 21:28:35.833863 
ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.833845 2567 log.go:25] "Validated CRI v1 image API" Apr 24 21:28:35.836406 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.836393 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 21:28:35.837006 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.836988 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:28:35.840671 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.840629 2567 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 88878f78-3608-4ae3-94f9-5b180691df95:/dev/nvme0n1p4 e03370e8-428b-47de-b21f-3146e793bcb2:/dev/nvme0n1p3] Apr 24 21:28:35.840759 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.840672 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 21:28:35.847125 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.847023 2567 manager.go:217] Machine: {Timestamp:2026-04-24 21:28:35.84518768 +0000 UTC m=+0.387139269 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099974 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d59ec7c4422c9159dea176131b614 SystemUUID:ec2d59ec-7c44-22c9-159d-ea176131b614 BootID:3e7a8cb4-b664-4bec-939a-90db2ec72d4e Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ed:7a:00:9e:cd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ed:7a:00:9e:cd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:90:63:60:d4:c1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 
Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 21:28:35.847125 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.847115 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 24 21:28:35.847246 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.847186 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 21:28:35.848099 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.848078 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 21:28:35.848238 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.848102 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-249.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:28:35.848280 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.848247 2567 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:28:35.848280 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.848256 2567 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:28:35.848280 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.848268 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:28:35.849302 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.849292 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" 
socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:28:35.850555 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.850545 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:28:35.850801 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.850792 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:28:35.853796 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.853786 2567 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:28:35.853837 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.853800 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:28:35.853837 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.853812 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:28:35.853837 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.853820 2567 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:28:35.853837 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.853829 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:28:35.855426 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.855412 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:28:35.855468 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.855439 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:28:35.858693 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.858679 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:28:35.860094 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.860081 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:28:35.861820 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.861808 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:28:35.861864 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.861826 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:28:35.861864 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.861833 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:28:35.861864 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.861840 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:28:35.861864 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.861846 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:28:35.861864 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.861852 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:28:35.861864 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.861857 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:28:35.861864 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.861863 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:28:35.862036 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.861869 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:28:35.862036 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.861876 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" 
Apr 24 21:28:35.862036 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.861891 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:28:35.862036 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.861899 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:28:35.862708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.862699 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:28:35.862738 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.862710 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:28:35.865677 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:35.865658 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-249.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:28:35.865812 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.865798 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-249.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:28:35.865996 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:35.865979 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:28:35.866671 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.866659 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:28:35.866703 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.866697 2567 server.go:1295] "Started kubelet" Apr 24 21:28:35.866805 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.866781 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:28:35.866914 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.866876 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:28:35.866967 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.866929 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:28:35.867402 ip-10-0-134-249 systemd[1]: Started Kubernetes Kubelet. 
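The "is forbidden: User \"system:anonymous\"" errors around kubelet startup look like the normal TLS-bootstrap window rather than a misconfiguration: as the earlier records show, the kubelet uses its bootstrap credentials to file a client-certificate CSR and points its kubeconfig at the certificate directory, so until that CSR (csr-mfrv9 a few records further down) is approved and issued, list/watch calls and event posts go out unauthenticated and are rejected. A quick sanity check is to see whether the forbidden messages are confined to the period before the certificate is issued; the sketch below does that against a saved journal, with journal.txt again being a placeholder path.

# Sketch: check that the system:anonymous "forbidden" errors fall inside the
# certificate-bootstrap window. Assumes one journal record per line; the
# journal's short timestamps carry no year, so 1900 appears as a dummy year.
import re
from datetime import datetime

TS = re.compile(r"^([A-Z][a-z]{2} +\d+ \d{2}:\d{2}:\d{2}\.\d+)")

def stamp(line):
    m = TS.match(line)
    return datetime.strptime(m.group(1), "%b %d %H:%M:%S.%f") if m else None

forbidden, issued = [], None
with open("journal.txt", encoding="utf-8") as f:
    for line in f:
        t = stamp(line)
        if t is None:
            continue
        if "system:anonymous" in line and "forbidden" in line:
            forbidden.append(t)
        elif "Certificate signing request is issued" in line:
            issued = t

if forbidden and issued:
    print(f"forbidden errors seen from {min(forbidden)} to {max(forbidden)}")
    print(f"client certificate issued at {issued}")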
Apr 24 21:28:35.868026 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.868005 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:28:35.868691 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.868678 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:28:35.872918 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.872902 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:28:35.873310 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:35.872468 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-249.ec2.internal.18a96837929dbaa5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-249.ec2.internal,UID:ip-10-0-134-249.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-249.ec2.internal,},FirstTimestamp:2026-04-24 21:28:35.866671781 +0000 UTC m=+0.408623371,LastTimestamp:2026-04-24 21:28:35.866671781 +0000 UTC m=+0.408623371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-249.ec2.internal,}" Apr 24 21:28:35.873414 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.873369 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:28:35.873972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.873956 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:28:35.874088 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.874076 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:28:35.874270 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.874258 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:28:35.874375 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.874259 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:28:35.874466 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.874454 2567 factory.go:55] Registering systemd factory Apr 24 21:28:35.874581 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.874567 2567 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:28:35.874696 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.874295 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:28:35.874792 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.874780 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:28:35.875185 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:35.874930 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:35.875965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.875949 2567 factory.go:153] Registering CRI-O factory Apr 24 21:28:35.876083 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.876073 2567 factory.go:223] Registration of the crio container factory successfully Apr 24 
21:28:35.876161 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.876145 2567 factory.go:103] Registering Raw factory Apr 24 21:28:35.876161 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.876165 2567 manager.go:1196] Started watching for new ooms in manager Apr 24 21:28:35.876626 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.876612 2567 manager.go:319] Starting recovery of all containers Apr 24 21:28:35.876777 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:35.876737 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:28:35.881823 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:35.881797 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:28:35.881926 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:35.881818 2567 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-249.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:28:35.888989 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.888894 2567 manager.go:324] Recovery completed Apr 24 21:28:35.891164 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.891148 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mfrv9" Apr 24 21:28:35.892885 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.892873 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:35.895170 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.895155 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:35.895252 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.895197 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:35.895252 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.895214 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:35.895671 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.895650 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:28:35.895671 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.895664 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:28:35.895749 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.895680 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:28:35.896793 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:35.896735 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-249.ec2.internal.18a968379450916c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-249.ec2.internal,UID:ip-10-0-134-249.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-249.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-249.ec2.internal,},FirstTimestamp:2026-04-24 21:28:35.895169388 +0000 UTC m=+0.437120983,LastTimestamp:2026-04-24 21:28:35.895169388 +0000 UTC m=+0.437120983,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-249.ec2.internal,}" Apr 24 21:28:35.897378 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.897355 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mfrv9" Apr 24 21:28:35.898643 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.898631 2567 policy_none.go:49] "None policy: Start" Apr 24 21:28:35.898677 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.898648 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:28:35.898677 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.898659 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:28:35.947711 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.933997 2567 manager.go:341] "Starting Device Plugin manager" Apr 24 21:28:35.947711 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:35.934041 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:28:35.947711 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.934055 2567 server.go:85] "Starting device plugin registration server" Apr 24 21:28:35.947711 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.934255 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:28:35.947711 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.934267 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:28:35.947711 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.934345 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:28:35.947711 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.934423 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:28:35.947711 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.934432 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:28:35.947711 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:35.934951 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:28:35.947711 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:35.934985 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:35.975940 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.975900 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:28:35.977066 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.977052 2567 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 24 21:28:35.977121 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.977077 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:28:35.977121 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.977095 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 21:28:35.977121 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.977104 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:28:35.977227 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:35.977140 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:28:35.979821 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:35.979806 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:28:36.035000 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.034984 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:36.035982 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.035966 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:36.036067 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.035992 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:36.036067 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.036002 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:36.036067 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.036028 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.045234 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.045217 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.045301 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.045237 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-249.ec2.internal\": node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:36.059180 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.059161 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:36.077945 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.077925 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-249.ec2.internal"] Apr 24 21:28:36.078025 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.077991 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:36.078782 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.078767 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:36.078866 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.078798 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:36.078866 ip-10-0-134-249 kubenswrapper[2567]: 
I0424 21:28:36.078813 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:36.080092 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.080078 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:36.080219 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.080203 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.080278 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.080246 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:36.080713 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.080698 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:36.080782 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.080727 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:36.080782 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.080742 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:36.080782 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.080704 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:36.080782 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.080770 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:36.080927 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.080785 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:36.082473 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.082456 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.082518 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.082489 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:28:36.083142 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.083126 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:28:36.083211 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.083148 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:28:36.083211 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.083162 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:28:36.107126 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.107106 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-249.ec2.internal\" not found" node="ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.111376 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.111359 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-249.ec2.internal\" not found" node="ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.159876 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.159856 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:36.175887 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.175868 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6e6f541dfff28fedc7fdca7f3c5d9590-config\") pod \"kube-apiserver-proxy-ip-10-0-134-249.ec2.internal\" (UID: \"6e6f541dfff28fedc7fdca7f3c5d9590\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.175946 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.175893 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e2e7dfa06eeb84e40d6ef86874d30644-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal\" (UID: \"e2e7dfa06eeb84e40d6ef86874d30644\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.175946 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.175909 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2e7dfa06eeb84e40d6ef86874d30644-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal\" (UID: \"e2e7dfa06eeb84e40d6ef86874d30644\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.260954 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.260937 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:36.276307 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.276267 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/e2e7dfa06eeb84e40d6ef86874d30644-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal\" (UID: \"e2e7dfa06eeb84e40d6ef86874d30644\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.276307 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.276293 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2e7dfa06eeb84e40d6ef86874d30644-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal\" (UID: \"e2e7dfa06eeb84e40d6ef86874d30644\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.276387 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.276308 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6e6f541dfff28fedc7fdca7f3c5d9590-config\") pod \"kube-apiserver-proxy-ip-10-0-134-249.ec2.internal\" (UID: \"6e6f541dfff28fedc7fdca7f3c5d9590\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.276387 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.276340 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6e6f541dfff28fedc7fdca7f3c5d9590-config\") pod \"kube-apiserver-proxy-ip-10-0-134-249.ec2.internal\" (UID: \"6e6f541dfff28fedc7fdca7f3c5d9590\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.276387 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.276351 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e2e7dfa06eeb84e40d6ef86874d30644-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal\" (UID: \"e2e7dfa06eeb84e40d6ef86874d30644\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.276387 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.276369 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2e7dfa06eeb84e40d6ef86874d30644-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal\" (UID: \"e2e7dfa06eeb84e40d6ef86874d30644\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.361689 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.361671 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:36.409178 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.409156 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.414676 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.414661 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-249.ec2.internal" Apr 24 21:28:36.462110 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.462085 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:36.562632 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.562586 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:36.663147 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.663125 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:36.763670 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.763648 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:36.785156 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.785138 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:28:36.785283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.785268 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:28:36.864717 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.864696 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:36.874157 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.874139 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:28:36.884602 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.884071 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:28:36.899057 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.899025 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:23:35 +0000 UTC" deadline="2027-09-25 14:03:56.310823622 +0000 UTC" Apr 24 21:28:36.899195 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.899062 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12448h35m19.411765402s" Apr 24 21:28:36.913259 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.913237 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tvt5c" Apr 24 21:28:36.924916 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.924898 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tvt5c" Apr 24 21:28:36.965147 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:36.965127 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-249.ec2.internal\" not found" Apr 24 21:28:36.981307 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:36.981277 2567 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2e7dfa06eeb84e40d6ef86874d30644.slice/crio-525b7931f30e846f7c296464b63bb9110e2ab148a355cf2fa45749a31a21be85 WatchSource:0}: Error finding container 525b7931f30e846f7c296464b63bb9110e2ab148a355cf2fa45749a31a21be85: Status 404 returned error can't find the container with id 525b7931f30e846f7c296464b63bb9110e2ab148a355cf2fa45749a31a21be85 Apr 24 21:28:36.981863 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:36.981845 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e6f541dfff28fedc7fdca7f3c5d9590.slice/crio-1222870adc468290216c3fb87e0fc3eabfddd77aee7a5a76b558d1bfb7b73d2f WatchSource:0}: Error finding container 1222870adc468290216c3fb87e0fc3eabfddd77aee7a5a76b558d1bfb7b73d2f: Status 404 returned error can't find the container with id 1222870adc468290216c3fb87e0fc3eabfddd77aee7a5a76b558d1bfb7b73d2f Apr 24 21:28:36.985451 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:36.985432 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:28:37.036629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.036607 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:28:37.074708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.074691 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" Apr 24 21:28:37.086429 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.086410 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:28:37.089551 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.089537 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:28:37.090371 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.090359 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-249.ec2.internal" Apr 24 21:28:37.100211 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.100195 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:28:37.138236 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.138193 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:28:37.855296 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.855267 2567 apiserver.go:52] "Watching apiserver" Apr 24 21:28:37.863588 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.863560 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:28:37.864004 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.863975 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-target-mxwm9","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp","openshift-multus/multus-additional-cni-plugins-x7f8g","openshift-multus/multus-vjdz9","openshift-network-operator/iptables-alerter-q6trf","openshift-ovn-kubernetes/ovnkube-node-qr6dh","kube-system/konnectivity-agent-g6b6t","kube-system/kube-apiserver-proxy-ip-10-0-134-249.ec2.internal","openshift-cluster-node-tuning-operator/tuned-pksbn","openshift-dns/node-resolver-xdhfk","openshift-image-registry/node-ca-qj6pk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal","openshift-multus/network-metrics-daemon-fnmhx"] Apr 24 21:28:37.866674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.866647 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g6b6t" Apr 24 21:28:37.868329 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.868175 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.869984 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.869563 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.870355 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.870328 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:28:37.870470 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.870443 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-v6jc2\"" Apr 24 21:28:37.870672 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.870650 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:28:37.870742 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.870690 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:28:37.870908 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.870888 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4nqjp\"" Apr 24 21:28:37.871010 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.870909 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:28:37.871010 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.870935 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.871010 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.871000 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:28:37.872228 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.872206 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:28:37.872324 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.872238 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:28:37.872385 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.872327 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:28:37.872843 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.872715 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zlgt8\"" Apr 24 21:28:37.872843 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.872778 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-q6trf" Apr 24 21:28:37.872990 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.872903 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:28:37.873407 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.873250 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:28:37.873494 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.873438 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j6dh8\"" Apr 24 21:28:37.874274 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.874255 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:28:37.876219 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.876062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.876219 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.876102 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:37.876219 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:37.876179 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:28:37.876219 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.876194 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-f4t5m\"" Apr 24 21:28:37.877157 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.877128 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:28:37.877483 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.877446 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:28:37.877591 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.877543 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:28:37.878458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.878437 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.881754 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.881023 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:28:37.881754 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.881142 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:28:37.881754 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.881479 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xdhfk" Apr 24 21:28:37.881754 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.881602 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:28:37.881754 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.881667 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:28:37.881754 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.881693 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:28:37.881754 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.881604 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mk6gs\"" Apr 24 21:28:37.882130 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.881766 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-hf4gz\"" Apr 24 21:28:37.882130 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.882067 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:28:37.882894 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.882780 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:28:37.883601 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.883239 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qj6pk" Apr 24 21:28:37.883915 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.883727 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pgp7d\"" Apr 24 21:28:37.884102 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884073 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-os-release\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.884182 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884121 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-run-netns\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.884182 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884145 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-kubelet\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.884182 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884168 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-cni-bin\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.884492 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884211 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsxsf\" (UniqueName: \"kubernetes.io/projected/54414862-ab41-4d71-8c87-40ce2fe45ac4-kube-api-access-qsxsf\") pod \"iptables-alerter-q6trf\" (UID: \"54414862-ab41-4d71-8c87-40ce2fe45ac4\") " pod="openshift-network-operator/iptables-alerter-q6trf" Apr 24 21:28:37.884492 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884238 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-node-log\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.884492 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884289 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-registration-dir\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.884492 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884335 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-os-release\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.884492 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884360 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54414862-ab41-4d71-8c87-40ce2fe45ac4-host-slash\") pod \"iptables-alerter-q6trf\" (UID: \"54414862-ab41-4d71-8c87-40ce2fe45ac4\") " pod="openshift-network-operator/iptables-alerter-q6trf" Apr 24 21:28:37.884492 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884394 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-run-netns\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.884492 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884419 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-socket-dir\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.884492 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884442 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-var-lib-openvswitch\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.884492 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884464 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-cni-binary-copy\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.884492 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884487 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-cnibin\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884512 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eea02f54-e68c-472a-b138-f0b60cf3f2b8-konnectivity-ca\") pod \"konnectivity-agent-g6b6t\" (UID: \"eea02f54-e68c-472a-b138-f0b60cf3f2b8\") " pod="kube-system/konnectivity-agent-g6b6t" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884552 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884576 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t657h\" (UniqueName: \"kubernetes.io/projected/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-kube-api-access-t657h\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884601 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-run-k8s-cni-cncf-io\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884626 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-var-lib-cni-bin\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884649 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-multus-daemon-config\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884672 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj6wg\" (UniqueName: \"kubernetes.io/projected/14209990-6df9-445b-a825-ae10b8c6b84d-kube-api-access-sj6wg\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884697 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-etc-openvswitch\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884720 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eea02f54-e68c-472a-b138-f0b60cf3f2b8-agent-certs\") pod \"konnectivity-agent-g6b6t\" (UID: \"eea02f54-e68c-472a-b138-f0b60cf3f2b8\") " pod="kube-system/konnectivity-agent-g6b6t" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884742 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-sys-fs\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 
21:28:37.884766 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx47b\" (UniqueName: \"kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b\") pod \"network-check-target-mxwm9\" (UID: \"8054fc2a-a4f1-4d30-9017-96d3932c580f\") " pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884787 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-run-ovn\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.884997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.884898 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885048 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14209990-6df9-445b-a825-ae10b8c6b84d-env-overrides\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885108 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-cni-binary-copy\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885122 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885137 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ckwx\" (UniqueName: \"kubernetes.io/projected/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-kube-api-access-2ckwx\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885334 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-log-socket\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885390 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14209990-6df9-445b-a825-ae10b8c6b84d-ovn-node-metrics-cert\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885419 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885462 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-var-lib-kubelet\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885490 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-slash\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885516 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885567 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-multus-cni-dir\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885588 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-etc-kubernetes\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885636 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-system-cni-dir\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.885674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885662 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54414862-ab41-4d71-8c87-40ce2fe45ac4-iptables-alerter-script\") pod \"iptables-alerter-q6trf\" (UID: \"54414862-ab41-4d71-8c87-40ce2fe45ac4\") " pod="openshift-network-operator/iptables-alerter-q6trf" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885682 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-run-openvswitch\") pod 
\"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885725 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-cni-netd\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885747 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14209990-6df9-445b-a825-ae10b8c6b84d-ovnkube-config\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885791 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-etc-selinux\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885819 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smgvz\" (UniqueName: \"kubernetes.io/projected/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-kube-api-access-smgvz\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885864 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885890 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885914 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-multus-socket-dir-parent\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885955 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885962 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-multus-conf-dir\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.885990 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-run-multus-certs\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:37.886013 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.886036 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-cnibin\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.886062 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-device-dir\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.886104 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-system-cni-dir\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.886126 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-var-lib-cni-multus\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.886397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.886147 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-hostroot\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.887230 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.886174 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-systemd-units\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.887230 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.886205 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-run-systemd\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.887230 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.886230 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-run-ovn-kubernetes\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.887230 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.886248 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/14209990-6df9-445b-a825-ae10b8c6b84d-ovnkube-script-lib\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.889150 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.888969 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:28:37.894323 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.894261 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:28:37.894426 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.894385 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:28:37.894426 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.894398 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:28:37.894554 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.894463 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zsbhm\"" Apr 24 21:28:37.925589 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.925560 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:23:36 +0000 UTC" deadline="2027-12-30 12:06:48.80219275 +0000 UTC" Apr 24 21:28:37.925589 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.925588 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14750h38m10.87660842s" Apr 24 21:28:37.977662 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.976882 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:28:37.981987 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.981934 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-134-249.ec2.internal" event={"ID":"6e6f541dfff28fedc7fdca7f3c5d9590","Type":"ContainerStarted","Data":"1222870adc468290216c3fb87e0fc3eabfddd77aee7a5a76b558d1bfb7b73d2f"} Apr 24 21:28:37.983264 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.983237 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" event={"ID":"e2e7dfa06eeb84e40d6ef86874d30644","Type":"ContainerStarted","Data":"525b7931f30e846f7c296464b63bb9110e2ab148a355cf2fa45749a31a21be85"} Apr 24 21:28:37.986578 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986553 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-device-dir\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.986662 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986593 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-system-cni-dir\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.986662 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986621 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-var-lib-cni-multus\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.986662 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-hostroot\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.986662 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986645 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-device-dir\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.986869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986710 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-var-lib-cni-multus\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.986869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986723 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-hostroot\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.986869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986722 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-system-cni-dir\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.986869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986747 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-systemd-units\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.986869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986782 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-run-systemd\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.986869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986808 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-run-ovn-kubernetes\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.986869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986832 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/14209990-6df9-445b-a825-ae10b8c6b84d-ovnkube-script-lib\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.986869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986838 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-systemd-units\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.986869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986860 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-sysconfig\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986872 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-run-ovn-kubernetes\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986879 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-run-systemd\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986885 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-sysctl-conf\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986938 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-systemd\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986968 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-sys\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.986992 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-host\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987028 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-tuned\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987061 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-os-release\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987089 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-run-netns\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987113 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-kubelet\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987138 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-cni-bin\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 
21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987174 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-kubelet\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987141 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-os-release\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987173 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsxsf\" (UniqueName: \"kubernetes.io/projected/54414862-ab41-4d71-8c87-40ce2fe45ac4-kube-api-access-qsxsf\") pod \"iptables-alerter-q6trf\" (UID: \"54414862-ab41-4d71-8c87-40ce2fe45ac4\") " pod="openshift-network-operator/iptables-alerter-q6trf" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987170 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-run-netns\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987211 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-cni-bin\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.987283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987219 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-node-log\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987253 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-node-log\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987254 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-modprobe-d\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987291 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-registration-dir\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987340 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-registration-dir\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987371 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-os-release\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987420 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-os-release\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987449 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54414862-ab41-4d71-8c87-40ce2fe45ac4-host-slash\") pod \"iptables-alerter-q6trf\" (UID: \"54414862-ab41-4d71-8c87-40ce2fe45ac4\") " pod="openshift-network-operator/iptables-alerter-q6trf" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987469 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/14209990-6df9-445b-a825-ae10b8c6b84d-ovnkube-script-lib\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987485 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54414862-ab41-4d71-8c87-40ce2fe45ac4-host-slash\") pod \"iptables-alerter-q6trf\" (UID: \"54414862-ab41-4d71-8c87-40ce2fe45ac4\") " pod="openshift-network-operator/iptables-alerter-q6trf" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987507 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-run-netns\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987556 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bv9f\" (UniqueName: \"kubernetes.io/projected/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-kube-api-access-2bv9f\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987586 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-socket-dir\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987602 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-var-lib-openvswitch\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987585 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-run-netns\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987628 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-sysctl-d\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987678 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-var-lib-openvswitch\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.988131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987690 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-lib-modules\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987717 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-cni-binary-copy\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987733 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-cnibin\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987749 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-var-lib-kubelet\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.988965 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:28:37.987754 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-socket-dir\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987766 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f3c0ee91-65bf-4825-9def-c62c4806cc59-tmp-dir\") pod \"node-resolver-xdhfk\" (UID: \"f3c0ee91-65bf-4825-9def-c62c4806cc59\") " pod="openshift-dns/node-resolver-xdhfk" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987790 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxpvs\" (UniqueName: \"kubernetes.io/projected/f3c0ee91-65bf-4825-9def-c62c4806cc59-kube-api-access-vxpvs\") pod \"node-resolver-xdhfk\" (UID: \"f3c0ee91-65bf-4825-9def-c62c4806cc59\") " pod="openshift-dns/node-resolver-xdhfk" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987808 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eea02f54-e68c-472a-b138-f0b60cf3f2b8-konnectivity-ca\") pod \"konnectivity-agent-g6b6t\" (UID: \"eea02f54-e68c-472a-b138-f0b60cf3f2b8\") " pod="kube-system/konnectivity-agent-g6b6t" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987823 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987838 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t657h\" (UniqueName: \"kubernetes.io/projected/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-kube-api-access-t657h\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987852 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-run-k8s-cni-cncf-io\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987872 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-var-lib-cni-bin\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987891 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-multus-daemon-config\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987911 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sj6wg\" (UniqueName: \"kubernetes.io/projected/14209990-6df9-445b-a825-ae10b8c6b84d-kube-api-access-sj6wg\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987943 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-etc-openvswitch\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987959 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eea02f54-e68c-472a-b138-f0b60cf3f2b8-agent-certs\") pod \"konnectivity-agent-g6b6t\" (UID: \"eea02f54-e68c-472a-b138-f0b60cf3f2b8\") " pod="kube-system/konnectivity-agent-g6b6t" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987974 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-sys-fs\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.988965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.987989 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx47b\" (UniqueName: \"kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b\") pod \"network-check-target-mxwm9\" (UID: \"8054fc2a-a4f1-4d30-9017-96d3932c580f\") " pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988007 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-run-ovn\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14209990-6df9-445b-a825-ae10b8c6b84d-env-overrides\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988036 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-cni-binary-copy\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988052 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2ckwx\" (UniqueName: \"kubernetes.io/projected/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-kube-api-access-2ckwx\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988067 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-log-socket\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988081 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14209990-6df9-445b-a825-ae10b8c6b84d-ovn-node-metrics-cert\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988098 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-var-lib-cni-bin\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988125 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-cni-binary-copy\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988125 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-sys-fs\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988166 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-cnibin\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988106 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-kubernetes\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988192 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-run-ovn\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988205 
2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-log-socket\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988327 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988552 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-multus-daemon-config\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988576 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14209990-6df9-445b-a825-ae10b8c6b84d-env-overrides\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.989708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988588 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f3c0ee91-65bf-4825-9def-c62c4806cc59-hosts-file\") pod \"node-resolver-xdhfk\" (UID: \"f3c0ee91-65bf-4825-9def-c62c4806cc59\") " pod="openshift-dns/node-resolver-xdhfk" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988607 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988621 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-var-lib-kubelet\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988677 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-var-lib-kubelet\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988678 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-cni-binary-copy\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988725 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-run-k8s-cni-cncf-io\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988628 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-etc-openvswitch\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988763 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-slash\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988792 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988812 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-slash\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988823 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988841 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988847 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ggq8\" (UniqueName: \"kubernetes.io/projected/755f7e1a-7e39-472a-9a15-3ecbce3571b8-kube-api-access-5ggq8\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988871 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-multus-cni-dir\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988910 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-etc-kubernetes\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988937 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/755f7e1a-7e39-472a-9a15-3ecbce3571b8-tmp\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988949 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-etc-kubernetes\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.990458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988959 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/58290685-bbba-48e3-936e-34b4f4d27034-serviceca\") pod \"node-ca-qj6pk\" (UID: \"58290685-bbba-48e3-936e-34b4f4d27034\") " pod="openshift-image-registry/node-ca-qj6pk" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988976 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-multus-cni-dir\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.988991 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-system-cni-dir\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989017 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54414862-ab41-4d71-8c87-40ce2fe45ac4-iptables-alerter-script\") pod \"iptables-alerter-q6trf\" (UID: \"54414862-ab41-4d71-8c87-40ce2fe45ac4\") " pod="openshift-network-operator/iptables-alerter-q6trf" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989025 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-system-cni-dir\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989042 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-run-openvswitch\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989063 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-cni-netd\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989080 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14209990-6df9-445b-a825-ae10b8c6b84d-ovnkube-config\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989096 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-etc-selinux\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989099 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eea02f54-e68c-472a-b138-f0b60cf3f2b8-konnectivity-ca\") pod \"konnectivity-agent-g6b6t\" (UID: \"eea02f54-e68c-472a-b138-f0b60cf3f2b8\") " pod="kube-system/konnectivity-agent-g6b6t" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989118 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smgvz\" (UniqueName: \"kubernetes.io/projected/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-kube-api-access-smgvz\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: 
\"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989123 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-run-openvswitch\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989158 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989185 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989215 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-multus-socket-dir-parent\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989237 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-multus-conf-dir\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989242 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-etc-selinux\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:37.991255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989263 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-run\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989311 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58290685-bbba-48e3-936e-34b4f4d27034-host\") pod \"node-ca-qj6pk\" (UID: \"58290685-bbba-48e3-936e-34b4f4d27034\") " pod="openshift-image-registry/node-ca-qj6pk" Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-run-multus-certs\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989373 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-cnibin\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989414 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79lfj\" (UniqueName: \"kubernetes.io/projected/58290685-bbba-48e3-936e-34b4f4d27034-kube-api-access-79lfj\") pod \"node-ca-qj6pk\" (UID: \"58290685-bbba-48e3-936e-34b4f4d27034\") " pod="openshift-image-registry/node-ca-qj6pk" Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989432 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989549 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-multus-socket-dir-parent\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989570 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-cnibin\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989582 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54414862-ab41-4d71-8c87-40ce2fe45ac4-iptables-alerter-script\") pod \"iptables-alerter-q6trf\" (UID: \"54414862-ab41-4d71-8c87-40ce2fe45ac4\") " pod="openshift-network-operator/iptables-alerter-q6trf" Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989589 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-host-run-multus-certs\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9" Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989606 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/14209990-6df9-445b-a825-ae10b8c6b84d-host-cni-netd\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:37.992008 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:28:37.989622 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-multus-conf-dir\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9"
Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989725 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g"
Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.989826 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g"
Apr 24 21:28:37.992008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.990099 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14209990-6df9-445b-a825-ae10b8c6b84d-ovnkube-config\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh"
Apr 24 21:28:37.992615 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.992215 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14209990-6df9-445b-a825-ae10b8c6b84d-ovn-node-metrics-cert\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh"
Apr 24 21:28:37.992615 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:37.992344 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eea02f54-e68c-472a-b138-f0b60cf3f2b8-agent-certs\") pod \"konnectivity-agent-g6b6t\" (UID: \"eea02f54-e68c-472a-b138-f0b60cf3f2b8\") " pod="kube-system/konnectivity-agent-g6b6t"
Apr 24 21:28:37.998497 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:37.998366 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:28:37.998497 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:37.998387 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:28:37.998497 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:37.998401 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tx47b for pod openshift-network-diagnostics/network-check-target-mxwm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:37.998497 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:37.998499 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b podName:8054fc2a-a4f1-4d30-9017-96d3932c580f nodeName:}" failed. No retries permitted until 2026-04-24 21:28:38.498448216 +0000 UTC m=+3.040399811 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tx47b" (UniqueName: "kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b") pod "network-check-target-mxwm9" (UID: "8054fc2a-a4f1-4d30-9017-96d3932c580f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:28:38.000789 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.000756 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t657h\" (UniqueName: \"kubernetes.io/projected/13ab36a9-43a5-4f65-a1cc-a42a4f3c183d-kube-api-access-t657h\") pod \"multus-additional-cni-plugins-x7f8g\" (UID: \"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d\") " pod="openshift-multus/multus-additional-cni-plugins-x7f8g"
Apr 24 21:28:38.000946 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.000913 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smgvz\" (UniqueName: \"kubernetes.io/projected/2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5-kube-api-access-smgvz\") pod \"aws-ebs-csi-driver-node-hgldp\" (UID: \"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp"
Apr 24 21:28:38.002100 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.001867 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsxsf\" (UniqueName: \"kubernetes.io/projected/54414862-ab41-4d71-8c87-40ce2fe45ac4-kube-api-access-qsxsf\") pod \"iptables-alerter-q6trf\" (UID: \"54414862-ab41-4d71-8c87-40ce2fe45ac4\") " pod="openshift-network-operator/iptables-alerter-q6trf"
Apr 24 21:28:38.002100 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.001965 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ckwx\" (UniqueName: \"kubernetes.io/projected/7e0b111b-0c8c-40e0-838f-d5768a4fd67a-kube-api-access-2ckwx\") pod \"multus-vjdz9\" (UID: \"7e0b111b-0c8c-40e0-838f-d5768a4fd67a\") " pod="openshift-multus/multus-vjdz9"
Apr 24 21:28:38.002100 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.002062 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj6wg\" (UniqueName: \"kubernetes.io/projected/14209990-6df9-445b-a825-ae10b8c6b84d-kube-api-access-sj6wg\") pod \"ovnkube-node-qr6dh\" (UID: \"14209990-6df9-445b-a825-ae10b8c6b84d\") " pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh"
Apr 24 21:28:38.032173 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.032148 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:28:38.090146 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090119 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-kubernetes\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn"
Apr 24 21:28:38.090257 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090157 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f3c0ee91-65bf-4825-9def-c62c4806cc59-hosts-file\") pod
\"node-resolver-xdhfk\" (UID: \"f3c0ee91-65bf-4825-9def-c62c4806cc59\") " pod="openshift-dns/node-resolver-xdhfk" Apr 24 21:28:38.090257 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090182 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:38.090257 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090223 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ggq8\" (UniqueName: \"kubernetes.io/projected/755f7e1a-7e39-472a-9a15-3ecbce3571b8-kube-api-access-5ggq8\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.090257 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090243 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-kubernetes\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.090257 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090248 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/755f7e1a-7e39-472a-9a15-3ecbce3571b8-tmp\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.090480 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090263 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f3c0ee91-65bf-4825-9def-c62c4806cc59-hosts-file\") pod \"node-resolver-xdhfk\" (UID: \"f3c0ee91-65bf-4825-9def-c62c4806cc59\") " pod="openshift-dns/node-resolver-xdhfk" Apr 24 21:28:38.090480 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090282 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/58290685-bbba-48e3-936e-34b4f4d27034-serviceca\") pod \"node-ca-qj6pk\" (UID: \"58290685-bbba-48e3-936e-34b4f4d27034\") " pod="openshift-image-registry/node-ca-qj6pk" Apr 24 21:28:38.090480 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-run\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.090480 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090387 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58290685-bbba-48e3-936e-34b4f4d27034-host\") pod \"node-ca-qj6pk\" (UID: \"58290685-bbba-48e3-936e-34b4f4d27034\") " pod="openshift-image-registry/node-ca-qj6pk" Apr 24 21:28:38.090480 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:38.090391 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:38.090480 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090417 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79lfj\" (UniqueName: \"kubernetes.io/projected/58290685-bbba-48e3-936e-34b4f4d27034-kube-api-access-79lfj\") pod \"node-ca-qj6pk\" (UID: \"58290685-bbba-48e3-936e-34b4f4d27034\") " pod="openshift-image-registry/node-ca-qj6pk" Apr 24 21:28:38.090480 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090454 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-sysconfig\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.090480 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:38.090469 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs podName:b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:38.59045014 +0000 UTC m=+3.132401748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs") pod "network-metrics-daemon-fnmhx" (UID: "b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090500 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-sysctl-conf\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090557 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-systemd\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090583 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-sys\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090606 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-host\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090628 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-tuned\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090646 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-systemd\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090659 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-modprobe-d\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090667 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-sysctl-conf\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090693 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bv9f\" (UniqueName: \"kubernetes.io/projected/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-kube-api-access-2bv9f\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090713 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/58290685-bbba-48e3-936e-34b4f4d27034-serviceca\") pod \"node-ca-qj6pk\" (UID: \"58290685-bbba-48e3-936e-34b4f4d27034\") " pod="openshift-image-registry/node-ca-qj6pk" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090724 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-sysctl-d\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090727 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-host\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090748 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58290685-bbba-48e3-936e-34b4f4d27034-host\") pod \"node-ca-qj6pk\" (UID: \"58290685-bbba-48e3-936e-34b4f4d27034\") " pod="openshift-image-registry/node-ca-qj6pk" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090765 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-sys\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090502 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-sysconfig\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090795 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-lib-modules\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090859 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-var-lib-kubelet\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.091623 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090890 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f3c0ee91-65bf-4825-9def-c62c4806cc59-tmp-dir\") pod \"node-resolver-xdhfk\" (UID: \"f3c0ee91-65bf-4825-9def-c62c4806cc59\") " pod="openshift-dns/node-resolver-xdhfk" Apr 24 21:28:38.092558 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090911 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-lib-modules\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.092558 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090918 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxpvs\" (UniqueName: \"kubernetes.io/projected/f3c0ee91-65bf-4825-9def-c62c4806cc59-kube-api-access-vxpvs\") pod \"node-resolver-xdhfk\" (UID: \"f3c0ee91-65bf-4825-9def-c62c4806cc59\") " pod="openshift-dns/node-resolver-xdhfk" Apr 24 21:28:38.092558 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090891 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-sysctl-d\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.092558 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.090931 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-var-lib-kubelet\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.092558 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.091031 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-modprobe-d\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.092558 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.091191 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/f3c0ee91-65bf-4825-9def-c62c4806cc59-tmp-dir\") pod \"node-resolver-xdhfk\" (UID: \"f3c0ee91-65bf-4825-9def-c62c4806cc59\") " pod="openshift-dns/node-resolver-xdhfk" Apr 24 21:28:38.092558 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.091239 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/755f7e1a-7e39-472a-9a15-3ecbce3571b8-run\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.092893 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.092689 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/755f7e1a-7e39-472a-9a15-3ecbce3571b8-tmp\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.095046 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.093080 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/755f7e1a-7e39-472a-9a15-3ecbce3571b8-etc-tuned\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.102236 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.102209 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxpvs\" (UniqueName: \"kubernetes.io/projected/f3c0ee91-65bf-4825-9def-c62c4806cc59-kube-api-access-vxpvs\") pod \"node-resolver-xdhfk\" (UID: \"f3c0ee91-65bf-4825-9def-c62c4806cc59\") " pod="openshift-dns/node-resolver-xdhfk" Apr 24 21:28:38.102429 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.102413 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ggq8\" (UniqueName: \"kubernetes.io/projected/755f7e1a-7e39-472a-9a15-3ecbce3571b8-kube-api-access-5ggq8\") pod \"tuned-pksbn\" (UID: \"755f7e1a-7e39-472a-9a15-3ecbce3571b8\") " pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.102785 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.102768 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bv9f\" (UniqueName: \"kubernetes.io/projected/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-kube-api-access-2bv9f\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:38.103083 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.103062 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79lfj\" (UniqueName: \"kubernetes.io/projected/58290685-bbba-48e3-936e-34b4f4d27034-kube-api-access-79lfj\") pod \"node-ca-qj6pk\" (UID: \"58290685-bbba-48e3-936e-34b4f4d27034\") " pod="openshift-image-registry/node-ca-qj6pk" Apr 24 21:28:38.188152 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.188072 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g6b6t" Apr 24 21:28:38.195920 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.195896 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" Apr 24 21:28:38.204472 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.204453 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x7f8g" Apr 24 21:28:38.212113 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.212093 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vjdz9" Apr 24 21:28:38.218663 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.218646 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-q6trf" Apr 24 21:28:38.226228 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.226210 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:28:38.233728 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.233711 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pksbn" Apr 24 21:28:38.242218 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.242200 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xdhfk" Apr 24 21:28:38.247650 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.247633 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qj6pk" Apr 24 21:28:38.594888 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.594822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx47b\" (UniqueName: \"kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b\") pod \"network-check-target-mxwm9\" (UID: \"8054fc2a-a4f1-4d30-9017-96d3932c580f\") " pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:38.594888 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.594859 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:38.595097 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:38.594942 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:38.595097 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:38.594989 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs podName:b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:39.594974704 +0000 UTC m=+4.136926286 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs") pod "network-metrics-daemon-fnmhx" (UID: "b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:38.595097 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:38.595013 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:38.595097 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:38.595033 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:38.595097 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:38.595046 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tx47b for pod openshift-network-diagnostics/network-check-target-mxwm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:38.595272 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:38.595117 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b podName:8054fc2a-a4f1-4d30-9017-96d3932c580f nodeName:}" failed. No retries permitted until 2026-04-24 21:28:39.595101463 +0000 UTC m=+4.137053043 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tx47b" (UniqueName: "kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b") pod "network-check-target-mxwm9" (UID: "8054fc2a-a4f1-4d30-9017-96d3932c580f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:38.638212 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:38.638120 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13ab36a9_43a5_4f65_a1cc_a42a4f3c183d.slice/crio-8de7d3275dd7ce8fc6ee19cdec8a2aa8cd2e029ee63f1eb5d2b1ea84f3e833f5 WatchSource:0}: Error finding container 8de7d3275dd7ce8fc6ee19cdec8a2aa8cd2e029ee63f1eb5d2b1ea84f3e833f5: Status 404 returned error can't find the container with id 8de7d3275dd7ce8fc6ee19cdec8a2aa8cd2e029ee63f1eb5d2b1ea84f3e833f5 Apr 24 21:28:38.639367 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:38.639343 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14209990_6df9_445b_a825_ae10b8c6b84d.slice/crio-34a338b0ddceb6b3cf45b787b5e3810421e93c5016b4f0bb79a071eb7b04ef90 WatchSource:0}: Error finding container 34a338b0ddceb6b3cf45b787b5e3810421e93c5016b4f0bb79a071eb7b04ef90: Status 404 returned error can't find the container with id 34a338b0ddceb6b3cf45b787b5e3810421e93c5016b4f0bb79a071eb7b04ef90 Apr 24 21:28:38.640102 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:38.639960 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeea02f54_e68c_472a_b138_f0b60cf3f2b8.slice/crio-e6c0c257e44f3ce1d7e34a39d262e07f6bc9832e01b2877948871cbcd3799844 WatchSource:0}: Error finding container 
e6c0c257e44f3ce1d7e34a39d262e07f6bc9832e01b2877948871cbcd3799844: Status 404 returned error can't find the container with id e6c0c257e44f3ce1d7e34a39d262e07f6bc9832e01b2877948871cbcd3799844 Apr 24 21:28:38.644465 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:38.644435 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod755f7e1a_7e39_472a_9a15_3ecbce3571b8.slice/crio-9187bcc1acd109e3a9e385f1d22303773a359d4c1ecc401b5cfd1ffa316f4b39 WatchSource:0}: Error finding container 9187bcc1acd109e3a9e385f1d22303773a359d4c1ecc401b5cfd1ffa316f4b39: Status 404 returned error can't find the container with id 9187bcc1acd109e3a9e385f1d22303773a359d4c1ecc401b5cfd1ffa316f4b39 Apr 24 21:28:38.645241 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:38.645218 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54414862_ab41_4d71_8c87_40ce2fe45ac4.slice/crio-726279270ebe7aba0f16c090750eca58b0f4ea0befc4a9852300740ea8d4b930 WatchSource:0}: Error finding container 726279270ebe7aba0f16c090750eca58b0f4ea0befc4a9852300740ea8d4b930: Status 404 returned error can't find the container with id 726279270ebe7aba0f16c090750eca58b0f4ea0befc4a9852300740ea8d4b930 Apr 24 21:28:38.646822 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:38.646570 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e0b111b_0c8c_40e0_838f_d5768a4fd67a.slice/crio-37d05070b917820fe89033172d9502fc06cee54e1c6d87b126259ebfeda2c924 WatchSource:0}: Error finding container 37d05070b917820fe89033172d9502fc06cee54e1c6d87b126259ebfeda2c924: Status 404 returned error can't find the container with id 37d05070b917820fe89033172d9502fc06cee54e1c6d87b126259ebfeda2c924 Apr 24 21:28:38.648059 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:38.647552 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3c0ee91_65bf_4825_9def_c62c4806cc59.slice/crio-e935d85b17fad2857099b54c5bd4e374f24e21475fc8beccc19da255ba367b5f WatchSource:0}: Error finding container e935d85b17fad2857099b54c5bd4e374f24e21475fc8beccc19da255ba367b5f: Status 404 returned error can't find the container with id e935d85b17fad2857099b54c5bd4e374f24e21475fc8beccc19da255ba367b5f Apr 24 21:28:38.648452 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:38.648434 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee6fabb_9f77_4f9a_a75d_586f99d9c0b5.slice/crio-4123013566d3bf816a4fee2e16d53831d7df919c9d8279b3e42bfd7453b1cdac WatchSource:0}: Error finding container 4123013566d3bf816a4fee2e16d53831d7df919c9d8279b3e42bfd7453b1cdac: Status 404 returned error can't find the container with id 4123013566d3bf816a4fee2e16d53831d7df919c9d8279b3e42bfd7453b1cdac Apr 24 21:28:38.650150 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:28:38.650017 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58290685_bbba_48e3_936e_34b4f4d27034.slice/crio-5fa23d9fff8d569002bbe888a0c70af849c93c8042abd037d559261fbeb6adf3 WatchSource:0}: Error finding container 5fa23d9fff8d569002bbe888a0c70af849c93c8042abd037d559261fbeb6adf3: Status 404 returned error can't find the container with id 5fa23d9fff8d569002bbe888a0c70af849c93c8042abd037d559261fbeb6adf3 Apr 24 
21:28:38.926809 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.926771 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:23:36 +0000 UTC" deadline="2027-11-09 02:06:52.198877809 +0000 UTC" Apr 24 21:28:38.926809 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.926806 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13516h38m13.27207551s" Apr 24 21:28:38.986944 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.986907 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7f8g" event={"ID":"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d","Type":"ContainerStarted","Data":"8de7d3275dd7ce8fc6ee19cdec8a2aa8cd2e029ee63f1eb5d2b1ea84f3e833f5"} Apr 24 21:28:38.989074 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.989048 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-249.ec2.internal" event={"ID":"6e6f541dfff28fedc7fdca7f3c5d9590","Type":"ContainerStarted","Data":"240870f77fb48d8bcf033bb6634ddab3d8b512ed5abb9547d7dc099cd33b2354"} Apr 24 21:28:38.990167 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.990140 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-q6trf" event={"ID":"54414862-ab41-4d71-8c87-40ce2fe45ac4","Type":"ContainerStarted","Data":"726279270ebe7aba0f16c090750eca58b0f4ea0befc4a9852300740ea8d4b930"} Apr 24 21:28:38.991473 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.991443 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vjdz9" event={"ID":"7e0b111b-0c8c-40e0-838f-d5768a4fd67a","Type":"ContainerStarted","Data":"37d05070b917820fe89033172d9502fc06cee54e1c6d87b126259ebfeda2c924"} Apr 24 21:28:38.992534 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.992492 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pksbn" event={"ID":"755f7e1a-7e39-472a-9a15-3ecbce3571b8","Type":"ContainerStarted","Data":"9187bcc1acd109e3a9e385f1d22303773a359d4c1ecc401b5cfd1ffa316f4b39"} Apr 24 21:28:38.993484 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.993462 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qj6pk" event={"ID":"58290685-bbba-48e3-936e-34b4f4d27034","Type":"ContainerStarted","Data":"5fa23d9fff8d569002bbe888a0c70af849c93c8042abd037d559261fbeb6adf3"} Apr 24 21:28:38.994572 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.994537 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" event={"ID":"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5","Type":"ContainerStarted","Data":"4123013566d3bf816a4fee2e16d53831d7df919c9d8279b3e42bfd7453b1cdac"} Apr 24 21:28:38.995664 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.995644 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xdhfk" event={"ID":"f3c0ee91-65bf-4825-9def-c62c4806cc59","Type":"ContainerStarted","Data":"e935d85b17fad2857099b54c5bd4e374f24e21475fc8beccc19da255ba367b5f"} Apr 24 21:28:38.996740 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.996717 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g6b6t" 
event={"ID":"eea02f54-e68c-472a-b138-f0b60cf3f2b8","Type":"ContainerStarted","Data":"e6c0c257e44f3ce1d7e34a39d262e07f6bc9832e01b2877948871cbcd3799844"} Apr 24 21:28:38.997898 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:38.997873 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" event={"ID":"14209990-6df9-445b-a825-ae10b8c6b84d","Type":"ContainerStarted","Data":"34a338b0ddceb6b3cf45b787b5e3810421e93c5016b4f0bb79a071eb7b04ef90"} Apr 24 21:28:39.600455 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:39.600362 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx47b\" (UniqueName: \"kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b\") pod \"network-check-target-mxwm9\" (UID: \"8054fc2a-a4f1-4d30-9017-96d3932c580f\") " pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:39.600455 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:39.600420 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:39.600684 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:39.600572 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:39.600684 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:39.600631 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs podName:b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:41.600613806 +0000 UTC m=+6.142565386 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs") pod "network-metrics-daemon-fnmhx" (UID: "b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:39.601059 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:39.601033 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:39.601059 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:39.601057 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:39.601202 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:39.601070 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tx47b for pod openshift-network-diagnostics/network-check-target-mxwm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:39.601202 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:39.601110 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b podName:8054fc2a-a4f1-4d30-9017-96d3932c580f nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:41.601095889 +0000 UTC m=+6.143047478 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tx47b" (UniqueName: "kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b") pod "network-check-target-mxwm9" (UID: "8054fc2a-a4f1-4d30-9017-96d3932c580f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:39.979181 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:39.978673 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:39.979181 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:39.978812 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:28:39.979709 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:39.979336 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:39.979709 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:39.979434 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:28:40.009090 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:40.009063 2567 generic.go:358] "Generic (PLEG): container finished" podID="e2e7dfa06eeb84e40d6ef86874d30644" containerID="165c9b2bb2ed4f4a77dc2e61bb49b42c5fef5e59f918f0393bd33bda4b42f77c" exitCode=0 Apr 24 21:28:40.009248 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:40.009184 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" event={"ID":"e2e7dfa06eeb84e40d6ef86874d30644","Type":"ContainerDied","Data":"165c9b2bb2ed4f4a77dc2e61bb49b42c5fef5e59f918f0393bd33bda4b42f77c"} Apr 24 21:28:40.023763 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:40.023717 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-249.ec2.internal" podStartSLOduration=3.023702785 podStartE2EDuration="3.023702785s" podCreationTimestamp="2026-04-24 21:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:39.003858845 +0000 UTC m=+3.545810445" watchObservedRunningTime="2026-04-24 21:28:40.023702785 +0000 UTC m=+4.565654386" Apr 24 21:28:41.016226 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:41.016190 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" event={"ID":"e2e7dfa06eeb84e40d6ef86874d30644","Type":"ContainerStarted","Data":"b7cfb52f216c1b57bdb21e383dbdef9f9eb4bee5c850260403625adae47f2e59"} Apr 24 21:28:41.616473 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:41.616438 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx47b\" (UniqueName: \"kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b\") pod \"network-check-target-mxwm9\" (UID: \"8054fc2a-a4f1-4d30-9017-96d3932c580f\") " pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:41.616670 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:41.616494 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:41.616670 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:41.616649 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:41.616792 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:41.616700 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs podName:b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:45.616687467 +0000 UTC m=+10.158639048 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs") pod "network-metrics-daemon-fnmhx" (UID: "b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:41.616977 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:41.616955 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:41.617043 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:41.616984 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:41.617043 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:41.616998 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tx47b for pod openshift-network-diagnostics/network-check-target-mxwm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:41.617141 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:41.617056 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b podName:8054fc2a-a4f1-4d30-9017-96d3932c580f nodeName:}" failed. No retries permitted until 2026-04-24 21:28:45.617038625 +0000 UTC m=+10.158990214 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-tx47b" (UniqueName: "kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b") pod "network-check-target-mxwm9" (UID: "8054fc2a-a4f1-4d30-9017-96d3932c580f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:41.978685 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:41.978011 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:41.978685 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:41.978129 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:28:41.978685 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:41.978475 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:41.978685 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:41.978600 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:28:43.978632 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:43.978064 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:43.978632 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:43.978235 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:28:43.978632 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:43.978276 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:43.978632 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:43.978398 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:28:45.646828 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:45.646795 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:45.647292 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:45.646878 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx47b\" (UniqueName: \"kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b\") pod \"network-check-target-mxwm9\" (UID: \"8054fc2a-a4f1-4d30-9017-96d3932c580f\") " pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:45.647292 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:45.647026 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:45.647292 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:45.647043 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:45.647292 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:45.647056 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tx47b for pod openshift-network-diagnostics/network-check-target-mxwm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:45.647292 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:45.647114 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b podName:8054fc2a-a4f1-4d30-9017-96d3932c580f nodeName:}" failed. No retries permitted until 2026-04-24 21:28:53.647094281 +0000 UTC m=+18.189045875 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tx47b" (UniqueName: "kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b") pod "network-check-target-mxwm9" (UID: "8054fc2a-a4f1-4d30-9017-96d3932c580f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:45.647557 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:45.647507 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:45.647638 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:45.647577 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs podName:b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:53.647561108 +0000 UTC m=+18.189512699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs") pod "network-metrics-daemon-fnmhx" (UID: "b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:45.978139 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:45.978066 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:45.978139 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:45.978089 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:45.978297 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:45.978193 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:28:45.979344 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:45.979300 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:28:46.875959 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:46.875907 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-249.ec2.internal" podStartSLOduration=9.875889866 podStartE2EDuration="9.875889866s" podCreationTimestamp="2026-04-24 21:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:41.031232668 +0000 UTC m=+5.573184267" watchObservedRunningTime="2026-04-24 21:28:46.875889866 +0000 UTC m=+11.417841465" Apr 24 21:28:46.876875 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:46.876688 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mxqrb"] Apr 24 21:28:46.879494 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:46.879473 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:46.879629 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:46.879560 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:28:46.956460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:46.956419 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-kubelet-config\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:46.956620 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:46.956505 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-dbus\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:46.956620 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:46.956559 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:47.057577 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:47.057501 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-kubelet-config\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:47.057717 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:47.057639 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-kubelet-config\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:47.057717 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:47.057660 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-dbus\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:47.057717 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:47.057708 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:47.057891 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:47.057781 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-dbus\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:47.057891 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:47.057835 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:47.057972 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:47.057898 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret podName:67ffea21-d8d4-46a1-ab6a-9c97c0cb589e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:47.557880886 +0000 UTC m=+12.099832469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret") pod "global-pull-secret-syncer-mxqrb" (UID: "67ffea21-d8d4-46a1-ab6a-9c97c0cb589e") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:47.562184 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:47.562150 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:47.562344 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:47.562315 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:47.562416 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:47.562389 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret podName:67ffea21-d8d4-46a1-ab6a-9c97c0cb589e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:48.562368601 +0000 UTC m=+13.104320192 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret") pod "global-pull-secret-syncer-mxqrb" (UID: "67ffea21-d8d4-46a1-ab6a-9c97c0cb589e") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:47.977606 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:47.977572 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:47.978041 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:47.977572 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:47.978041 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:47.977690 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:28:47.978041 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:47.977584 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:47.978041 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:47.977758 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:28:47.978041 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:47.977913 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:28:48.570992 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:48.570948 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:48.571149 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:48.571074 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:48.571149 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:48.571138 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret podName:67ffea21-d8d4-46a1-ab6a-9c97c0cb589e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:50.571120147 +0000 UTC m=+15.113071724 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret") pod "global-pull-secret-syncer-mxqrb" (UID: "67ffea21-d8d4-46a1-ab6a-9c97c0cb589e") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:49.978066 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:49.978037 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:49.978578 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:49.978036 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:49.978578 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:49.978162 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:28:49.978578 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:49.978207 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:28:49.978578 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:49.978040 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:49.978578 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:49.978281 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:28:50.585599 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:50.585558 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:50.585773 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:50.585698 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:50.585773 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:50.585772 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret podName:67ffea21-d8d4-46a1-ab6a-9c97c0cb589e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:54.585751551 +0000 UTC m=+19.127703129 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret") pod "global-pull-secret-syncer-mxqrb" (UID: "67ffea21-d8d4-46a1-ab6a-9c97c0cb589e") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:51.978297 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:51.978260 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:51.978741 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:51.978268 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:51.978741 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:51.978387 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:28:51.978741 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:51.978268 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:51.978741 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:51.978472 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:28:51.978741 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:51.978573 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:28:53.708392 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:53.708356 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx47b\" (UniqueName: \"kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b\") pod \"network-check-target-mxwm9\" (UID: \"8054fc2a-a4f1-4d30-9017-96d3932c580f\") " pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:53.708845 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:53.708401 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:53.708845 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:53.708535 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:28:53.708845 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:53.708556 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:28:53.708845 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:53.708565 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tx47b for pod openshift-network-diagnostics/network-check-target-mxwm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:53.708845 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:53.708562 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:53.708845 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:53.708617 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b podName:8054fc2a-a4f1-4d30-9017-96d3932c580f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:09.708600253 +0000 UTC m=+34.250551841 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-tx47b" (UniqueName: "kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b") pod "network-check-target-mxwm9" (UID: "8054fc2a-a4f1-4d30-9017-96d3932c580f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:28:53.708845 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:53.708633 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs podName:b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:09.70862623 +0000 UTC m=+34.250577807 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs") pod "network-metrics-daemon-fnmhx" (UID: "b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:28:53.977353 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:53.977269 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:53.977505 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:53.977278 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:53.977505 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:53.977401 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:28:53.977505 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:53.977278 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:53.977505 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:53.977468 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:28:53.977716 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:53.977569 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:28:54.615661 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:54.615629 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:54.615836 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:54.615743 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:54.615836 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:54.615813 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret podName:67ffea21-d8d4-46a1-ab6a-9c97c0cb589e nodeName:}" failed. No retries permitted until 2026-04-24 21:29:02.615795222 +0000 UTC m=+27.157746805 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret") pod "global-pull-secret-syncer-mxqrb" (UID: "67ffea21-d8d4-46a1-ab6a-9c97c0cb589e") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:28:55.979314 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:55.979015 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:55.979844 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:55.979074 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:55.979844 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:55.979410 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:28:55.979844 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:55.979089 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:55.979844 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:55.979485 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:28:55.979844 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:55.979517 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:28:56.043865 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.043827 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xdhfk" event={"ID":"f3c0ee91-65bf-4825-9def-c62c4806cc59","Type":"ContainerStarted","Data":"f3908cdae2cb1f78b46c555a7a6aa1a723247003ad558755471e4241f7a9bba3"} Apr 24 21:28:56.045264 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.045223 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g6b6t" event={"ID":"eea02f54-e68c-472a-b138-f0b60cf3f2b8","Type":"ContainerStarted","Data":"1e17412fe38a253850fb89b97dff87e7e0ef472c20f77ba7f7849bdaa2b088d4"} Apr 24 21:28:56.047398 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.047378 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:28:56.051260 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.048224 2567 generic.go:358] "Generic (PLEG): container finished" podID="14209990-6df9-445b-a825-ae10b8c6b84d" containerID="49b6b19119654b26f94751ea7c17054ec96281360ca80ed2c712621916545d2b" exitCode=1 Apr 24 21:28:56.051260 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.048291 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" event={"ID":"14209990-6df9-445b-a825-ae10b8c6b84d","Type":"ContainerStarted","Data":"ff14cfb330f4985eb74a4f38b065a0c6d363e87a79e6eb544af56afc12a5805a"} Apr 24 21:28:56.051260 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.048321 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" event={"ID":"14209990-6df9-445b-a825-ae10b8c6b84d","Type":"ContainerStarted","Data":"509545c467fc56727033dbb0cec8604841e89c50485474ab641b5faf5ebeea8e"} Apr 24 21:28:56.051260 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.048365 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" event={"ID":"14209990-6df9-445b-a825-ae10b8c6b84d","Type":"ContainerDied","Data":"49b6b19119654b26f94751ea7c17054ec96281360ca80ed2c712621916545d2b"} Apr 24 21:28:56.051260 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.048490 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" event={"ID":"14209990-6df9-445b-a825-ae10b8c6b84d","Type":"ContainerStarted","Data":"d160424dafe3e84aeaad6ec047e33e8c81386c4909da31c7efc2cec19ce9663c"} Apr 24 21:28:56.052936 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.052879 2567 generic.go:358] "Generic (PLEG): container finished" podID="13ab36a9-43a5-4f65-a1cc-a42a4f3c183d" containerID="8187cc711a16eca39c4c0d0b5a053e7d84893f396edb3476c5503a637f205cf2" exitCode=0 Apr 24 21:28:56.052936 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.052909 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7f8g" event={"ID":"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d","Type":"ContainerDied","Data":"8187cc711a16eca39c4c0d0b5a053e7d84893f396edb3476c5503a637f205cf2"} Apr 24 21:28:56.054366 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.054267 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vjdz9" 
event={"ID":"7e0b111b-0c8c-40e0-838f-d5768a4fd67a","Type":"ContainerStarted","Data":"754269aa62fd630a7fe0fa076907c19cf7f59bd679e216648e089a9fa40d41f8"} Apr 24 21:28:56.055955 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.055935 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pksbn" event={"ID":"755f7e1a-7e39-472a-9a15-3ecbce3571b8","Type":"ContainerStarted","Data":"adadd01832a4172f4c7ea30697f424df4327eb76c0dbedb3514f0a8d6e941ab3"} Apr 24 21:28:56.057567 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.057539 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qj6pk" event={"ID":"58290685-bbba-48e3-936e-34b4f4d27034","Type":"ContainerStarted","Data":"68648c3184ba729230f75a78b0ce18fbb00ec2f471c2a97c4f60595e18228f2a"} Apr 24 21:28:56.058845 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.058715 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xdhfk" podStartSLOduration=3.546753807 podStartE2EDuration="20.058701464s" podCreationTimestamp="2026-04-24 21:28:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:38.650480602 +0000 UTC m=+3.192432193" lastFinishedPulling="2026-04-24 21:28:55.162428262 +0000 UTC m=+19.704379850" observedRunningTime="2026-04-24 21:28:56.058684811 +0000 UTC m=+20.600636411" watchObservedRunningTime="2026-04-24 21:28:56.058701464 +0000 UTC m=+20.600653063" Apr 24 21:28:56.061995 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.061969 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" event={"ID":"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5","Type":"ContainerStarted","Data":"726c77e48085826e38544657ff4afed36b3b710826f3ebe19a73a5639b2be545"} Apr 24 21:28:56.088117 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.088078 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vjdz9" podStartSLOduration=3.497780992 podStartE2EDuration="20.088067891s" podCreationTimestamp="2026-04-24 21:28:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:38.64871391 +0000 UTC m=+3.190665501" lastFinishedPulling="2026-04-24 21:28:55.239000818 +0000 UTC m=+19.780952400" observedRunningTime="2026-04-24 21:28:56.087903627 +0000 UTC m=+20.629855227" watchObservedRunningTime="2026-04-24 21:28:56.088067891 +0000 UTC m=+20.630019489" Apr 24 21:28:56.112549 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.112506 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-g6b6t" podStartSLOduration=11.331323843 podStartE2EDuration="20.112493351s" podCreationTimestamp="2026-04-24 21:28:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:38.643640244 +0000 UTC m=+3.185591834" lastFinishedPulling="2026-04-24 21:28:47.424809754 +0000 UTC m=+11.966761342" observedRunningTime="2026-04-24 21:28:56.099763016 +0000 UTC m=+20.641714616" watchObservedRunningTime="2026-04-24 21:28:56.112493351 +0000 UTC m=+20.654444950" Apr 24 21:28:56.112632 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.112591 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-pksbn" podStartSLOduration=3.51967582 podStartE2EDuration="20.112587188s" podCreationTimestamp="2026-04-24 21:28:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:38.646191111 +0000 UTC m=+3.188142688" lastFinishedPulling="2026-04-24 21:28:55.23910246 +0000 UTC 
m=+19.781054056" observedRunningTime="2026-04-24 21:28:56.112353054 +0000 UTC m=+20.654304653" watchObservedRunningTime="2026-04-24 21:28:56.112587188 +0000 UTC m=+20.654538786" Apr 24 21:28:56.124974 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.124932 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qj6pk" podStartSLOduration=3.54272031 podStartE2EDuration="20.124924397s" podCreationTimestamp="2026-04-24 21:28:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:38.651628197 +0000 UTC m=+3.193579789" lastFinishedPulling="2026-04-24 21:28:55.233832284 +0000 UTC m=+19.775783876" observedRunningTime="2026-04-24 21:28:56.124855776 +0000 UTC m=+20.666807376" watchObservedRunningTime="2026-04-24 21:28:56.124924397 +0000 UTC m=+20.666875996" Apr 24 21:28:56.576978 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.576851 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-g6b6t" Apr 24 21:28:56.577317 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.577303 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-g6b6t" Apr 24 21:28:56.604316 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.604284 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:28:56.944824 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.944721 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:28:56.604304388Z","UUID":"73110ae3-5b98-4ef3-b60f-8bc2acd904b4","Handler":null,"Name":"","Endpoint":""} Apr 24 21:28:56.947375 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.947349 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:28:56.947375 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:56.947381 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:28:57.065457 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:57.065422 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-q6trf" event={"ID":"54414862-ab41-4d71-8c87-40ce2fe45ac4","Type":"ContainerStarted","Data":"d8bb644a14f901b9d4f97bb86a275509963ccf6f390332e40ec9365cbeb8b3f1"} Apr 24 21:28:57.067332 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:57.067302 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" event={"ID":"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5","Type":"ContainerStarted","Data":"b7663db6978334075596010661f13f26d80f8a3e4c3582b59619bb1fa326379a"} Apr 24 21:28:57.070110 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:57.070086 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:28:57.070469 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:57.070437 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" 
event={"ID":"14209990-6df9-445b-a825-ae10b8c6b84d","Type":"ContainerStarted","Data":"9b2add356c0812dc82e50a4bc823cd33401479f443bec88d3d96c3366f4fc8aa"} Apr 24 21:28:57.070469 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:57.070464 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" event={"ID":"14209990-6df9-445b-a825-ae10b8c6b84d","Type":"ContainerStarted","Data":"fa211efddcbecf40ce83f4af3f012034749ddb1470b1a0c5e34b523268e6d554"} Apr 24 21:28:57.081912 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:57.081864 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-q6trf" podStartSLOduration=4.5664941169999995 podStartE2EDuration="21.081849157s" podCreationTimestamp="2026-04-24 21:28:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:38.647065682 +0000 UTC m=+3.189017262" lastFinishedPulling="2026-04-24 21:28:55.162420718 +0000 UTC m=+19.704372302" observedRunningTime="2026-04-24 21:28:57.081285793 +0000 UTC m=+21.623237396" watchObservedRunningTime="2026-04-24 21:28:57.081849157 +0000 UTC m=+21.623800759" Apr 24 21:28:57.978289 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:57.978262 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:57.978398 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:57.978262 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:57.978398 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:57.978366 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:28:57.978494 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:57.978414 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:57.978494 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:57.978479 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:28:57.978608 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:57.978553 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:28:58.074649 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:58.074570 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" event={"ID":"2ee6fabb-9f77-4f9a-a75d-586f99d9c0b5","Type":"ContainerStarted","Data":"767b71d80a7ed6a793599d28ecfc4487546f93936c123143402378351b71db5e"} Apr 24 21:28:58.074649 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:58.074632 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:28:58.105830 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:58.105780 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hgldp" podStartSLOduration=2.968801068 podStartE2EDuration="22.105763994s" podCreationTimestamp="2026-04-24 21:28:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:38.650602282 +0000 UTC m=+3.192553860" lastFinishedPulling="2026-04-24 21:28:57.787565194 +0000 UTC m=+22.329516786" observedRunningTime="2026-04-24 21:28:58.105293045 +0000 UTC m=+22.647244664" watchObservedRunningTime="2026-04-24 21:28:58.105763994 +0000 UTC m=+22.647715594" Apr 24 21:28:59.080181 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:59.080152 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:28:59.080837 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:59.080562 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" event={"ID":"14209990-6df9-445b-a825-ae10b8c6b84d","Type":"ContainerStarted","Data":"3e499e4580599f38d55f2cfafb955a3c364bf9f38680b06a99ab9f9d7df2e7c7"} Apr 24 21:28:59.977929 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:59.977895 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:28:59.978106 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:59.977902 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:28:59.978106 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:59.978017 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:28:59.978225 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:59.978106 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:28:59.978225 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:28:59.977901 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:28:59.978310 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:28:59.978239 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:29:01.088873 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:01.088715 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:29:01.089357 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:01.089217 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" event={"ID":"14209990-6df9-445b-a825-ae10b8c6b84d","Type":"ContainerStarted","Data":"c7e061d8c63a148d1c6663f6d5246241fbe541413edf551c0e43ab8e01690df7"} Apr 24 21:29:01.089554 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:01.089498 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:29:01.089554 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:01.089541 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:29:01.089765 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:01.089740 2567 scope.go:117] "RemoveContainer" containerID="49b6b19119654b26f94751ea7c17054ec96281360ca80ed2c712621916545d2b" Apr 24 21:29:01.090992 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:01.090967 2567 generic.go:358] "Generic (PLEG): container finished" podID="13ab36a9-43a5-4f65-a1cc-a42a4f3c183d" containerID="1bd4563c0e13a52fce34e87f609cc250a327dd7f692ec66ddc79d88d9e18bd76" exitCode=0 Apr 24 21:29:01.091099 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:01.090998 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7f8g" event={"ID":"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d","Type":"ContainerDied","Data":"1bd4563c0e13a52fce34e87f609cc250a327dd7f692ec66ddc79d88d9e18bd76"} Apr 24 21:29:01.105978 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:01.105957 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:29:01.106562 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:01.106541 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:29:01.977683 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:01.977651 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:29:01.977767 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:01.977680 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:29:01.977767 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:01.977680 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:29:01.977837 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:01.977764 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:29:01.977894 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:01.977868 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:29:01.977982 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:01.977962 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:29:02.096882 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.096856 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:29:02.097301 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.097275 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" event={"ID":"14209990-6df9-445b-a825-ae10b8c6b84d","Type":"ContainerStarted","Data":"c7c2c82d2f268ffec59a95f53b68cba212b18dd6229573afa445f1af4056fee0"} Apr 24 21:29:02.097442 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.097424 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:29:02.100206 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.099990 2567 generic.go:358] "Generic (PLEG): container finished" podID="13ab36a9-43a5-4f65-a1cc-a42a4f3c183d" containerID="e179b7a540132bd628cedb1984afa8fdfae1fd9795a765fb7ec76e3c0ebd4e04" exitCode=0 Apr 24 21:29:02.100206 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.100052 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7f8g" event={"ID":"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d","Type":"ContainerDied","Data":"e179b7a540132bd628cedb1984afa8fdfae1fd9795a765fb7ec76e3c0ebd4e04"} Apr 24 21:29:02.132560 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.132458 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" podStartSLOduration=9.34159093 podStartE2EDuration="26.132445736s" podCreationTimestamp="2026-04-24 21:28:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:38.642392535 +0000 UTC m=+3.184344127" lastFinishedPulling="2026-04-24 21:28:55.433247338 +0000 UTC m=+19.975198933" observedRunningTime="2026-04-24 21:29:02.131914991 +0000 UTC m=+26.673866590" watchObservedRunningTime="2026-04-24 21:29:02.132445736 +0000 UTC m=+26.674397335" Apr 24 
21:29:02.374829 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.374799 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mxwm9"] Apr 24 21:29:02.374958 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.374923 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:29:02.375045 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:02.375021 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:29:02.382385 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.382352 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fnmhx"] Apr 24 21:29:02.382509 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.382457 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:29:02.382635 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:02.382578 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:29:02.383126 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.383104 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mxqrb"] Apr 24 21:29:02.383217 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.383204 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:29:02.383350 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:02.383321 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:29:02.633808 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.633732 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:29:02.680304 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:02.680278 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:29:02.680452 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:02.680401 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:29:02.680501 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:02.680457 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret podName:67ffea21-d8d4-46a1-ab6a-9c97c0cb589e nodeName:}" failed. No retries permitted until 2026-04-24 21:29:18.680442433 +0000 UTC m=+43.222394009 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret") pod "global-pull-secret-syncer-mxqrb" (UID: "67ffea21-d8d4-46a1-ab6a-9c97c0cb589e") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:29:03.103834 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:03.103803 2567 generic.go:358] "Generic (PLEG): container finished" podID="13ab36a9-43a5-4f65-a1cc-a42a4f3c183d" containerID="99fdad77b7724901c02c52c47dfdcc9dcfa5f99bcbfe2bd3af18d4abcfd72b92" exitCode=0 Apr 24 21:29:03.104195 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:03.103882 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7f8g" event={"ID":"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d","Type":"ContainerDied","Data":"99fdad77b7724901c02c52c47dfdcc9dcfa5f99bcbfe2bd3af18d4abcfd72b92"} Apr 24 21:29:03.978214 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:03.978181 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:29:03.978376 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:03.978178 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:29:03.978376 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:03.978314 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:29:03.978482 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:03.978383 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:29:03.978482 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:03.978181 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:29:03.978625 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:03.978481 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:29:04.492142 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:04.492115 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-g6b6t" Apr 24 21:29:04.492509 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:04.492248 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:29:04.492884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:04.492861 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-g6b6t" Apr 24 21:29:05.978396 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:05.978363 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:29:05.979144 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:05.978454 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:29:05.979144 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:05.978490 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:29:05.979144 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:05.978601 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:29:05.979144 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:05.978654 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:29:05.979144 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:05.978762 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:29:07.977948 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:07.977912 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:29:07.978403 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:07.977975 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:29:07.978403 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:07.978081 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:29:07.978403 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:07.978172 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:29:07.978403 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:07.978274 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mxqrb" podUID="67ffea21-d8d4-46a1-ab6a-9c97c0cb589e" Apr 24 21:29:07.978601 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:07.978397 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mxwm9" podUID="8054fc2a-a4f1-4d30-9017-96d3932c580f" Apr 24 21:29:08.142372 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.142345 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-249.ec2.internal" event="NodeReady" Apr 24 21:29:08.142545 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.142492 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:29:08.182785 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.182698 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj"] Apr 24 21:29:08.214029 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.214005 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8fbf6b564-dm5w4"] Apr 24 21:29:08.214175 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.214161 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:29:08.216390 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.216360 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-54st7\"" Apr 24 21:29:08.216572 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.216552 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 21:29:08.216572 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.216556 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 21:29:08.238030 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.237959 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj"] Apr 24 21:29:08.238118 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.238047 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8fbf6b564-dm5w4"] Apr 24 21:29:08.238118 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.238066 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5tp2b"] Apr 24 21:29:08.238118 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.238077 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.241730 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.241708 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:29:08.241872 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.241854 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:29:08.241970 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.241875 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:29:08.241970 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.241943 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ptjx6\"" Apr 24 21:29:08.249333 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.248440 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:29:08.265247 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.265226 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rrff5"] Apr 24 21:29:08.265472 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.265446 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:08.267900 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.267825 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:29:08.267900 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.267828 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:29:08.268156 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.268136 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-sbjvw\"" Apr 24 21:29:08.292129 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.292108 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5tp2b"] Apr 24 21:29:08.292225 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.292185 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rrff5"] Apr 24 21:29:08.292290 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.292238 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:29:08.294717 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.294501 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:29:08.294717 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.294620 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kxf55\"" Apr 24 21:29:08.294717 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.294698 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:29:08.294910 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.294776 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:29:08.324377 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.324348 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgbz9\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-kube-api-access-lgbz9\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.324469 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.324402 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e10024bb-2bc4-4af2-987b-a3c774bd83a1-installation-pull-secrets\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.324469 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.324455 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-bound-sa-token\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.324566 ip-10-0-134-249 kubenswrapper[2567]: I0424 
21:29:08.324518 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e10024bb-2bc4-4af2-987b-a3c774bd83a1-image-registry-private-configuration\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.324629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.324562 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e10024bb-2bc4-4af2-987b-a3c774bd83a1-ca-trust-extracted\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.324629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.324598 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:29:08.324629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.324615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.324745 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.324638 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-certificates\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.324745 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.324689 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e10024bb-2bc4-4af2-987b-a3c774bd83a1-trusted-ca\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.324745 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.324722 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:29:08.425811 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.425779 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e10024bb-2bc4-4af2-987b-a3c774bd83a1-trusted-ca\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " 
pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.425950 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.425816 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:29:08.425950 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.425842 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:08.425950 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.425875 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgbz9\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-kube-api-access-lgbz9\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.425950 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.425899 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2xsg\" (UniqueName: \"kubernetes.io/projected/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-kube-api-access-b2xsg\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:08.425950 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.425916 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:29:08.426179 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.425955 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-config-volume\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:08.426179 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426008 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b4g5\" (UniqueName: \"kubernetes.io/projected/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-kube-api-access-8b4g5\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:29:08.426179 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426061 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e10024bb-2bc4-4af2-987b-a3c774bd83a1-installation-pull-secrets\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.426179 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426117 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-bound-sa-token\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.426179 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426150 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-tmp-dir\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:08.426391 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426186 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e10024bb-2bc4-4af2-987b-a3c774bd83a1-image-registry-private-configuration\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.426391 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426214 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e10024bb-2bc4-4af2-987b-a3c774bd83a1-ca-trust-extracted\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.426391 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426255 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:29:08.426391 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426279 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.426391 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426306 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-certificates\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.426391 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.426377 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:29:08.426682 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.426449 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert podName:6b55e3f0-b53c-4afe-88de-6b0ada988fc9 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:29:08.926429686 +0000 UTC m=+33.468381281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f8hxj" (UID: "6b55e3f0-b53c-4afe-88de-6b0ada988fc9") : secret "networking-console-plugin-cert" not found Apr 24 21:29:08.426682 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.426577 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:08.426682 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.426593 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8fbf6b564-dm5w4: secret "image-registry-tls" not found Apr 24 21:29:08.426682 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.426648 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls podName:e10024bb-2bc4-4af2-987b-a3c774bd83a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:08.926631704 +0000 UTC m=+33.468583295 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls") pod "image-registry-8fbf6b564-dm5w4" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1") : secret "image-registry-tls" not found Apr 24 21:29:08.426682 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426656 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:29:08.426933 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426778 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e10024bb-2bc4-4af2-987b-a3c774bd83a1-ca-trust-extracted\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.426933 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426855 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e10024bb-2bc4-4af2-987b-a3c774bd83a1-trusted-ca\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.427023 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.426989 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-certificates\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.430097 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.430076 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/e10024bb-2bc4-4af2-987b-a3c774bd83a1-image-registry-private-configuration\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.430210 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.430102 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e10024bb-2bc4-4af2-987b-a3c774bd83a1-installation-pull-secrets\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.435982 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.435907 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgbz9\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-kube-api-access-lgbz9\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.435982 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.435946 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-bound-sa-token\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.526769 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.526586 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:08.526911 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.526795 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2xsg\" (UniqueName: \"kubernetes.io/projected/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-kube-api-access-b2xsg\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:08.526911 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.526819 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:29:08.526911 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.526720 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:08.526911 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.526862 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-config-volume\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:08.526911 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.526890 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b4g5\" (UniqueName: 
\"kubernetes.io/projected/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-kube-api-access-8b4g5\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:29:08.527124 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.526959 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls podName:6a5a230a-7dee-4c4a-882c-d0cb1e017e43 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:09.026936265 +0000 UTC m=+33.568887851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls") pod "dns-default-5tp2b" (UID: "6a5a230a-7dee-4c4a-882c-d0cb1e017e43") : secret "dns-default-metrics-tls" not found Apr 24 21:29:08.527124 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.527003 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-tmp-dir\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:08.527124 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.527040 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:08.527124 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.527095 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert podName:3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:09.027082414 +0000 UTC m=+33.569034004 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert") pod "ingress-canary-rrff5" (UID: "3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f") : secret "canary-serving-cert" not found Apr 24 21:29:08.527370 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.527353 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-tmp-dir\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:08.527610 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.527592 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-config-volume\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:08.539040 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.539010 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2xsg\" (UniqueName: \"kubernetes.io/projected/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-kube-api-access-b2xsg\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:08.539133 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.539116 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b4g5\" (UniqueName: \"kubernetes.io/projected/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-kube-api-access-8b4g5\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:29:08.930635 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.930601 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:29:08.930635 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:08.930637 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:08.930858 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.930744 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:29:08.930858 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.930804 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert podName:6b55e3f0-b53c-4afe-88de-6b0ada988fc9 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:09.930788882 +0000 UTC m=+34.472740465 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f8hxj" (UID: "6b55e3f0-b53c-4afe-88de-6b0ada988fc9") : secret "networking-console-plugin-cert" not found Apr 24 21:29:08.930858 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.930808 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:08.930858 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.930824 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8fbf6b564-dm5w4: secret "image-registry-tls" not found Apr 24 21:29:08.931020 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:08.930883 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls podName:e10024bb-2bc4-4af2-987b-a3c774bd83a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:09.930867536 +0000 UTC m=+34.472819135 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls") pod "image-registry-8fbf6b564-dm5w4" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1") : secret "image-registry-tls" not found Apr 24 21:29:09.032114 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.032095 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:09.032418 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.032122 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:29:09.032418 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.032241 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:09.032418 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.032247 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:09.032418 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.032279 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert podName:3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:10.03226626 +0000 UTC m=+34.574217837 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert") pod "ingress-canary-rrff5" (UID: "3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f") : secret "canary-serving-cert" not found Apr 24 21:29:09.032418 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.032295 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls podName:6a5a230a-7dee-4c4a-882c-d0cb1e017e43 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:29:10.032287006 +0000 UTC m=+34.574238585 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls") pod "dns-default-5tp2b" (UID: "6a5a230a-7dee-4c4a-882c-d0cb1e017e43") : secret "dns-default-metrics-tls" not found Apr 24 21:29:09.738268 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.738236 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx47b\" (UniqueName: \"kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b\") pod \"network-check-target-mxwm9\" (UID: \"8054fc2a-a4f1-4d30-9017-96d3932c580f\") " pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:29:09.738467 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.738276 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:29:09.738467 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.738394 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:29:09.738467 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.738447 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs podName:b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:41.738431492 +0000 UTC m=+66.280383072 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs") pod "network-metrics-daemon-fnmhx" (UID: "b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:29:09.738608 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.738512 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:29:09.738608 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.738554 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:29:09.738608 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.738567 2567 projected.go:194] Error preparing data for projected volume kube-api-access-tx47b for pod openshift-network-diagnostics/network-check-target-mxwm9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:29:09.738696 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.738621 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b podName:8054fc2a-a4f1-4d30-9017-96d3932c580f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:41.738608579 +0000 UTC m=+66.280560159 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tx47b" (UniqueName: "kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b") pod "network-check-target-mxwm9" (UID: "8054fc2a-a4f1-4d30-9017-96d3932c580f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:29:09.939753 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.939715 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:29:09.939916 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.939754 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:09.939916 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.939852 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:29:09.939916 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.939883 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:09.939916 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.939893 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8fbf6b564-dm5w4: secret "image-registry-tls" not found Apr 24 21:29:09.939916 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.939913 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert podName:6b55e3f0-b53c-4afe-88de-6b0ada988fc9 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:11.939898828 +0000 UTC m=+36.481850404 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f8hxj" (UID: "6b55e3f0-b53c-4afe-88de-6b0ada988fc9") : secret "networking-console-plugin-cert" not found Apr 24 21:29:09.940080 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:09.939929 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls podName:e10024bb-2bc4-4af2-987b-a3c774bd83a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:11.939918166 +0000 UTC m=+36.481869743 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls") pod "image-registry-8fbf6b564-dm5w4" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1") : secret "image-registry-tls" not found Apr 24 21:29:09.977888 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.977866 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:29:09.977983 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.977892 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:29:09.978147 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.978115 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:29:09.980321 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.980302 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:29:09.981042 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.981024 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:29:09.981152 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.981139 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:29:09.981203 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.981185 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-flqk8\"" Apr 24 21:29:09.981247 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.981185 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p6sxw\"" Apr 24 21:29:09.981281 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:09.981249 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:29:10.041040 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:10.041022 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:10.041320 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:10.041049 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:29:10.041320 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:10.041163 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:10.041320 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:10.041225 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls podName:6a5a230a-7dee-4c4a-882c-d0cb1e017e43 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:12.041211044 +0000 UTC m=+36.583162621 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls") pod "dns-default-5tp2b" (UID: "6a5a230a-7dee-4c4a-882c-d0cb1e017e43") : secret "dns-default-metrics-tls" not found Apr 24 21:29:10.041320 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:10.041167 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:10.041320 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:10.041292 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert podName:3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:12.041278772 +0000 UTC m=+36.583230363 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert") pod "ingress-canary-rrff5" (UID: "3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f") : secret "canary-serving-cert" not found Apr 24 21:29:10.120133 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:10.120106 2567 generic.go:358] "Generic (PLEG): container finished" podID="13ab36a9-43a5-4f65-a1cc-a42a4f3c183d" containerID="1a363ce228a22867cf6542b8192a87169587f7fb72c5c9293f3d77fc4c15c234" exitCode=0 Apr 24 21:29:10.120240 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:10.120156 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7f8g" event={"ID":"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d","Type":"ContainerDied","Data":"1a363ce228a22867cf6542b8192a87169587f7fb72c5c9293f3d77fc4c15c234"} Apr 24 21:29:11.124095 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:11.124064 2567 generic.go:358] "Generic (PLEG): container finished" podID="13ab36a9-43a5-4f65-a1cc-a42a4f3c183d" containerID="dcfd6fd1ec03f51b2d8eedd7cd4fb02a494375f20881d407da8d217e05ead54d" exitCode=0 Apr 24 21:29:11.124511 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:11.124118 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7f8g" event={"ID":"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d","Type":"ContainerDied","Data":"dcfd6fd1ec03f51b2d8eedd7cd4fb02a494375f20881d407da8d217e05ead54d"} Apr 24 21:29:11.955929 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:11.955897 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:29:11.956047 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:11.955934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:11.956086 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:11.956053 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:29:11.956131 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:11.956120 2567 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert podName:6b55e3f0-b53c-4afe-88de-6b0ada988fc9 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:15.956104974 +0000 UTC m=+40.498056557 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f8hxj" (UID: "6b55e3f0-b53c-4afe-88de-6b0ada988fc9") : secret "networking-console-plugin-cert" not found Apr 24 21:29:11.956176 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:11.956056 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:11.956176 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:11.956162 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8fbf6b564-dm5w4: secret "image-registry-tls" not found Apr 24 21:29:11.956236 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:11.956201 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls podName:e10024bb-2bc4-4af2-987b-a3c774bd83a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:15.956190117 +0000 UTC m=+40.498141709 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls") pod "image-registry-8fbf6b564-dm5w4" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1") : secret "image-registry-tls" not found Apr 24 21:29:12.056254 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:12.056230 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:12.056254 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:12.056259 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:29:12.056427 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:12.056361 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:12.056427 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:12.056383 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:12.056427 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:12.056412 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls podName:6a5a230a-7dee-4c4a-882c-d0cb1e017e43 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:16.056399705 +0000 UTC m=+40.598351281 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls") pod "dns-default-5tp2b" (UID: "6a5a230a-7dee-4c4a-882c-d0cb1e017e43") : secret "dns-default-metrics-tls" not found Apr 24 21:29:12.056427 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:12.056426 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert podName:3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:16.05641962 +0000 UTC m=+40.598371197 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert") pod "ingress-canary-rrff5" (UID: "3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f") : secret "canary-serving-cert" not found Apr 24 21:29:12.128945 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:12.128916 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7f8g" event={"ID":"13ab36a9-43a5-4f65-a1cc-a42a4f3c183d","Type":"ContainerStarted","Data":"8bf2dae47095a82dec073956e98dfa8d9595425db9c459be50e8941299d8507d"} Apr 24 21:29:12.163973 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:12.163935 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-x7f8g" podStartSLOduration=5.790104601 podStartE2EDuration="36.163915817s" podCreationTimestamp="2026-04-24 21:28:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:38.64040651 +0000 UTC m=+3.182358098" lastFinishedPulling="2026-04-24 21:29:09.014217722 +0000 UTC m=+33.556169314" observedRunningTime="2026-04-24 21:29:12.163586726 +0000 UTC m=+36.705538341" watchObservedRunningTime="2026-04-24 21:29:12.163915817 +0000 UTC m=+36.705867429" Apr 24 21:29:15.982923 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:15.982889 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:29:15.982923 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:15.982923 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:15.983318 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:15.983015 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:29:15.983318 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:15.983064 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:15.983318 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:15.983072 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert podName:6b55e3f0-b53c-4afe-88de-6b0ada988fc9 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:29:23.983056614 +0000 UTC m=+48.525008191 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f8hxj" (UID: "6b55e3f0-b53c-4afe-88de-6b0ada988fc9") : secret "networking-console-plugin-cert" not found Apr 24 21:29:15.983318 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:15.983076 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8fbf6b564-dm5w4: secret "image-registry-tls" not found Apr 24 21:29:15.983318 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:15.983111 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls podName:e10024bb-2bc4-4af2-987b-a3c774bd83a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:23.983100247 +0000 UTC m=+48.525051824 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls") pod "image-registry-8fbf6b564-dm5w4" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1") : secret "image-registry-tls" not found Apr 24 21:29:16.083865 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:16.083837 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:16.083865 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:16.083865 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:29:16.084072 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:16.083944 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:16.084072 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:16.083980 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert podName:3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:24.083969612 +0000 UTC m=+48.625921188 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert") pod "ingress-canary-rrff5" (UID: "3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f") : secret "canary-serving-cert" not found Apr 24 21:29:16.084072 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:16.083999 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:16.084072 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:16.084070 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls podName:6a5a230a-7dee-4c4a-882c-d0cb1e017e43 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:29:24.084052019 +0000 UTC m=+48.626003603 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls") pod "dns-default-5tp2b" (UID: "6a5a230a-7dee-4c4a-882c-d0cb1e017e43") : secret "dns-default-metrics-tls" not found Apr 24 21:29:18.701474 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:18.701425 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:29:18.704928 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:18.704900 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67ffea21-d8d4-46a1-ab6a-9c97c0cb589e-original-pull-secret\") pod \"global-pull-secret-syncer-mxqrb\" (UID: \"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e\") " pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:29:18.997003 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:18.996921 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mxqrb" Apr 24 21:29:19.120444 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:19.120416 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mxqrb"] Apr 24 21:29:19.123724 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:29:19.123700 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67ffea21_d8d4_46a1_ab6a_9c97c0cb589e.slice/crio-a85c0dd77041a47a4c7422cf9082705ccfc3c6492a980e8fdea9d69f299de1ef WatchSource:0}: Error finding container a85c0dd77041a47a4c7422cf9082705ccfc3c6492a980e8fdea9d69f299de1ef: Status 404 returned error can't find the container with id a85c0dd77041a47a4c7422cf9082705ccfc3c6492a980e8fdea9d69f299de1ef Apr 24 21:29:19.143474 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:19.143450 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mxqrb" event={"ID":"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e","Type":"ContainerStarted","Data":"a85c0dd77041a47a4c7422cf9082705ccfc3c6492a980e8fdea9d69f299de1ef"} Apr 24 21:29:24.040772 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:24.040739 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:29:24.040772 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:24.040775 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:24.041153 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:24.040872 2567 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:29:24.041153 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:24.040892 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:24.041153 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:24.040902 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8fbf6b564-dm5w4: secret "image-registry-tls" not found Apr 24 21:29:24.041153 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:24.040940 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert podName:6b55e3f0-b53c-4afe-88de-6b0ada988fc9 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:40.040924647 +0000 UTC m=+64.582876224 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f8hxj" (UID: "6b55e3f0-b53c-4afe-88de-6b0ada988fc9") : secret "networking-console-plugin-cert" not found Apr 24 21:29:24.041153 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:24.040954 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls podName:e10024bb-2bc4-4af2-987b-a3c774bd83a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:40.040947786 +0000 UTC m=+64.582899363 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls") pod "image-registry-8fbf6b564-dm5w4" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1") : secret "image-registry-tls" not found Apr 24 21:29:24.141464 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:24.141440 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:24.141464 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:24.141468 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:29:24.141670 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:24.141594 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:24.141670 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:24.141646 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls podName:6a5a230a-7dee-4c4a-882c-d0cb1e017e43 nodeName:}" failed. No retries permitted until 2026-04-24 21:29:40.141632885 +0000 UTC m=+64.683584466 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls") pod "dns-default-5tp2b" (UID: "6a5a230a-7dee-4c4a-882c-d0cb1e017e43") : secret "dns-default-metrics-tls" not found Apr 24 21:29:24.141800 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:24.141599 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:24.141800 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:24.141714 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert podName:3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f nodeName:}" failed. No retries permitted until 2026-04-24 21:29:40.141703457 +0000 UTC m=+64.683655034 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert") pod "ingress-canary-rrff5" (UID: "3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f") : secret "canary-serving-cert" not found Apr 24 21:29:24.154991 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:24.154964 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mxqrb" event={"ID":"67ffea21-d8d4-46a1-ab6a-9c97c0cb589e","Type":"ContainerStarted","Data":"47c3235280f511717da29f3da05a6a92f6e157a60bab21df165f213074e3dedf"} Apr 24 21:29:24.170767 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:24.170731 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mxqrb" podStartSLOduration=34.282197635 podStartE2EDuration="38.170720634s" podCreationTimestamp="2026-04-24 21:28:46 +0000 UTC" firstStartedPulling="2026-04-24 21:29:19.125576645 +0000 UTC m=+43.667528235" lastFinishedPulling="2026-04-24 21:29:23.014099649 +0000 UTC m=+47.556051234" observedRunningTime="2026-04-24 21:29:24.169945968 +0000 UTC m=+48.711897566" watchObservedRunningTime="2026-04-24 21:29:24.170720634 +0000 UTC m=+48.712672233" Apr 24 21:29:34.116086 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:34.116058 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qr6dh" Apr 24 21:29:40.046407 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:40.046370 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:29:40.046407 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:40.046406 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:29:40.046906 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:40.046504 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:29:40.046906 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:40.046514 2567 projected.go:194] Error preparing data for projected volume registry-tls for 
pod openshift-image-registry/image-registry-8fbf6b564-dm5w4: secret "image-registry-tls" not found Apr 24 21:29:40.046906 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:40.046566 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:29:40.046906 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:40.046585 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls podName:e10024bb-2bc4-4af2-987b-a3c774bd83a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:12.046569908 +0000 UTC m=+96.588521485 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls") pod "image-registry-8fbf6b564-dm5w4" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1") : secret "image-registry-tls" not found Apr 24 21:29:40.046906 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:40.046633 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert podName:6b55e3f0-b53c-4afe-88de-6b0ada988fc9 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:12.046613956 +0000 UTC m=+96.588565538 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f8hxj" (UID: "6b55e3f0-b53c-4afe-88de-6b0ada988fc9") : secret "networking-console-plugin-cert" not found Apr 24 21:29:40.147473 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:40.147440 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:29:40.147473 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:40.147485 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:29:40.147676 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:40.147573 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:29:40.147676 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:40.147617 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls podName:6a5a230a-7dee-4c4a-882c-d0cb1e017e43 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:12.147603177 +0000 UTC m=+96.689554759 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls") pod "dns-default-5tp2b" (UID: "6a5a230a-7dee-4c4a-882c-d0cb1e017e43") : secret "dns-default-metrics-tls" not found Apr 24 21:29:40.147752 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:40.147693 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:29:40.147752 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:40.147734 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert podName:3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f nodeName:}" failed. No retries permitted until 2026-04-24 21:30:12.147723406 +0000 UTC m=+96.689674982 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert") pod "ingress-canary-rrff5" (UID: "3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f") : secret "canary-serving-cert" not found Apr 24 21:29:41.759439 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:41.759401 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx47b\" (UniqueName: \"kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b\") pod \"network-check-target-mxwm9\" (UID: \"8054fc2a-a4f1-4d30-9017-96d3932c580f\") " pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:29:41.759439 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:41.759446 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:29:41.762137 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:41.762119 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:29:41.762201 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:41.762168 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:29:41.770175 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:41.770153 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:29:41.770231 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:29:41.770219 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs podName:b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58 nodeName:}" failed. No retries permitted until 2026-04-24 21:30:45.770199453 +0000 UTC m=+130.312151044 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs") pod "network-metrics-daemon-fnmhx" (UID: "b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58") : secret "metrics-daemon-secret" not found Apr 24 21:29:41.772208 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:41.772192 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:29:41.783350 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:41.783328 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx47b\" (UniqueName: \"kubernetes.io/projected/8054fc2a-a4f1-4d30-9017-96d3932c580f-kube-api-access-tx47b\") pod \"network-check-target-mxwm9\" (UID: \"8054fc2a-a4f1-4d30-9017-96d3932c580f\") " pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:29:41.795654 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:41.795630 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p6sxw\"" Apr 24 21:29:41.804216 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:41.804198 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:29:41.921129 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:41.921102 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mxwm9"] Apr 24 21:29:41.924627 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:29:41.924602 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8054fc2a_a4f1_4d30_9017_96d3932c580f.slice/crio-1ed78dc526b760f64b5ede24465c7fc9faa36a8076a72e9d7a7777a15fadf776 WatchSource:0}: Error finding container 1ed78dc526b760f64b5ede24465c7fc9faa36a8076a72e9d7a7777a15fadf776: Status 404 returned error can't find the container with id 1ed78dc526b760f64b5ede24465c7fc9faa36a8076a72e9d7a7777a15fadf776 Apr 24 21:29:42.188804 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:42.188772 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mxwm9" event={"ID":"8054fc2a-a4f1-4d30-9017-96d3932c580f","Type":"ContainerStarted","Data":"1ed78dc526b760f64b5ede24465c7fc9faa36a8076a72e9d7a7777a15fadf776"} Apr 24 21:29:45.196161 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:45.196122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mxwm9" event={"ID":"8054fc2a-a4f1-4d30-9017-96d3932c580f","Type":"ContainerStarted","Data":"42c36531d1e6d80ed3a486c109216f4539d23aabb68c10e27eb553885a6f1d66"} Apr 24 21:29:45.196704 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:45.196254 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:29:45.212426 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:29:45.212386 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mxwm9" podStartSLOduration=66.628117129 podStartE2EDuration="1m9.212375619s" podCreationTimestamp="2026-04-24 21:28:36 +0000 UTC" firstStartedPulling="2026-04-24 21:29:41.9270839 +0000 UTC m=+66.469035477" lastFinishedPulling="2026-04-24 21:29:44.511342381 +0000 UTC m=+69.053293967" 
observedRunningTime="2026-04-24 21:29:45.211415631 +0000 UTC m=+69.753367229" watchObservedRunningTime="2026-04-24 21:29:45.212375619 +0000 UTC m=+69.754327218" Apr 24 21:30:12.067986 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:30:12.067948 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:30:12.068388 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:30:12.067993 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:30:12.068388 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:30:12.068099 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:30:12.068388 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:30:12.068116 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8fbf6b564-dm5w4: secret "image-registry-tls" not found Apr 24 21:30:12.068388 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:30:12.068136 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:30:12.068388 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:30:12.068180 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls podName:e10024bb-2bc4-4af2-987b-a3c774bd83a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:16.068164054 +0000 UTC m=+160.610115646 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls") pod "image-registry-8fbf6b564-dm5w4" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1") : secret "image-registry-tls" not found Apr 24 21:30:12.068388 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:30:12.068214 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert podName:6b55e3f0-b53c-4afe-88de-6b0ada988fc9 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:16.068193883 +0000 UTC m=+160.610145467 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f8hxj" (UID: "6b55e3f0-b53c-4afe-88de-6b0ada988fc9") : secret "networking-console-plugin-cert" not found Apr 24 21:30:12.169162 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:30:12.169101 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:30:12.169162 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:30:12.169128 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:30:12.169307 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:30:12.169210 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:30:12.169307 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:30:12.169219 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:30:12.169307 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:30:12.169263 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert podName:3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f nodeName:}" failed. No retries permitted until 2026-04-24 21:31:16.169250957 +0000 UTC m=+160.711202534 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert") pod "ingress-canary-rrff5" (UID: "3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f") : secret "canary-serving-cert" not found Apr 24 21:30:12.169307 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:30:12.169287 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls podName:6a5a230a-7dee-4c4a-882c-d0cb1e017e43 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:16.169274825 +0000 UTC m=+160.711226406 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls") pod "dns-default-5tp2b" (UID: "6a5a230a-7dee-4c4a-882c-d0cb1e017e43") : secret "dns-default-metrics-tls" not found Apr 24 21:30:16.199865 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:30:16.199835 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mxwm9" Apr 24 21:30:45.796866 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:30:45.796819 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:30:45.797298 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:30:45.796975 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:30:45.797298 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:30:45.797045 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs podName:b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:47.797029512 +0000 UTC m=+252.338981093 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs") pod "network-metrics-daemon-fnmhx" (UID: "b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58") : secret "metrics-daemon-secret" not found Apr 24 21:31:11.224985 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:11.224946 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" podUID="6b55e3f0-b53c-4afe-88de-6b0ada988fc9" Apr 24 21:31:11.252265 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:11.252237 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" podUID="e10024bb-2bc4-4af2-987b-a3c774bd83a1" Apr 24 21:31:11.275513 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:11.275486 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-5tp2b" podUID="6a5a230a-7dee-4c4a-882c-d0cb1e017e43" Apr 24 21:31:11.301767 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:11.301744 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rrff5" podUID="3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f" Apr 24 21:31:11.352171 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:11.352155 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:31:11.352171 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:11.352164 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5tp2b" Apr 24 21:31:11.352302 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:11.352164 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:31:11.352302 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:11.352250 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:31:12.987772 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:12.987725 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-fnmhx" podUID="b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58" Apr 24 21:31:13.463023 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.462989 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5hcrl"] Apr 24 21:31:13.465748 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.465725 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5hcrl" Apr 24 21:31:13.468174 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.468156 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:31:13.468342 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.468329 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-sxhfw\"" Apr 24 21:31:13.468585 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.468572 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 21:31:13.474653 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.474631 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5hcrl"] Apr 24 21:31:13.563355 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.563331 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr"] Apr 24 21:31:13.565911 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.565896 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" Apr 24 21:31:13.568134 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.568109 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-rf2zg\"" Apr 24 21:31:13.568373 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.568358 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 21:31:13.568615 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.568596 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:31:13.568751 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.568597 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 21:31:13.569579 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.569559 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 21:31:13.569719 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.569702 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx"] Apr 24 21:31:13.572607 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.572587 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-crpwz"] Apr 24 21:31:13.572857 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.572740 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:13.574912 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.574895 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 21:31:13.575142 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.575122 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6fb67cdcfb-x2njp"] Apr 24 21:31:13.575240 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.575160 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-xsb94\"" Apr 24 21:31:13.575240 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.575173 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:31:13.575344 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.575277 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.576024 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.576006 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 21:31:13.576114 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.576009 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:31:13.577744 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.577725 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr"] Apr 24 21:31:13.577845 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.577831 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.577905 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.577857 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:31:13.578176 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.578157 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-wgtmw\"" Apr 24 21:31:13.578300 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.578202 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 21:31:13.578490 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.578475 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 21:31:13.578572 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.578507 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:31:13.579932 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.579914 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 21:31:13.580038 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.579953 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 21:31:13.580094 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.580046 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 21:31:13.580772 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.580269 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 21:31:13.580772 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.580386 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 21:31:13.580772 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.580470 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-4ftd8\"" Apr 24 21:31:13.580772 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.580491 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 21:31:13.581890 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:31:13.581869 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn2np\" (UniqueName: \"kubernetes.io/projected/5acaffdc-65cb-4b11-a572-3f3d38f308c0-kube-api-access-dn2np\") pod \"volume-data-source-validator-7c6cbb6c87-5hcrl\" (UID: \"5acaffdc-65cb-4b11-a572-3f3d38f308c0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5hcrl" Apr 24 21:31:13.583613 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.583593 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx"] Apr 24 21:31:13.584553 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.584398 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 21:31:13.584700 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.584684 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-crpwz"] Apr 24 21:31:13.585569 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.585549 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6fb67cdcfb-x2njp"] Apr 24 21:31:13.682769 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.682743 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/32477472-4713-476c-ac5d-4d2a735ad4b7-tmp\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.682880 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.682774 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32477472-4713-476c-ac5d-4d2a735ad4b7-serving-cert\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.682880 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.682798 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-stats-auth\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.682880 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.682821 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fv69\" (UniqueName: \"kubernetes.io/projected/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-kube-api-access-2fv69\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.682880 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.682856 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn2np\" (UniqueName: \"kubernetes.io/projected/5acaffdc-65cb-4b11-a572-3f3d38f308c0-kube-api-access-dn2np\") pod \"volume-data-source-validator-7c6cbb6c87-5hcrl\" (UID: \"5acaffdc-65cb-4b11-a572-3f3d38f308c0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5hcrl" Apr 24 21:31:13.683176 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:31:13.682915 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74fh7\" (UniqueName: \"kubernetes.io/projected/23da5cf6-4806-441a-8d04-9ddc0c84d07b-kube-api-access-74fh7\") pod \"kube-storage-version-migrator-operator-6769c5d45-f22xr\" (UID: \"23da5cf6-4806-441a-8d04-9ddc0c84d07b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" Apr 24 21:31:13.683176 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.682951 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88sl8\" (UniqueName: \"kubernetes.io/projected/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-kube-api-access-88sl8\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:13.683176 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.683007 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32477472-4713-476c-ac5d-4d2a735ad4b7-service-ca-bundle\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.683176 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.683040 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-default-certificate\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.683176 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.683069 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n22fn\" (UniqueName: \"kubernetes.io/projected/32477472-4713-476c-ac5d-4d2a735ad4b7-kube-api-access-n22fn\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.683176 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.683101 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/32477472-4713-476c-ac5d-4d2a735ad4b7-snapshots\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.683176 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.683126 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.683413 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.683186 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32477472-4713-476c-ac5d-4d2a735ad4b7-trusted-ca-bundle\") pod 
\"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.683413 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.683224 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:13.683413 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.683244 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.683413 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.683260 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23da5cf6-4806-441a-8d04-9ddc0c84d07b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-f22xr\" (UID: \"23da5cf6-4806-441a-8d04-9ddc0c84d07b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" Apr 24 21:31:13.683413 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.683288 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:13.683413 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.683303 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23da5cf6-4806-441a-8d04-9ddc0c84d07b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-f22xr\" (UID: \"23da5cf6-4806-441a-8d04-9ddc0c84d07b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" Apr 24 21:31:13.692651 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.692633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn2np\" (UniqueName: \"kubernetes.io/projected/5acaffdc-65cb-4b11-a572-3f3d38f308c0-kube-api-access-dn2np\") pod \"volume-data-source-validator-7c6cbb6c87-5hcrl\" (UID: \"5acaffdc-65cb-4b11-a572-3f3d38f308c0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5hcrl" Apr 24 21:31:13.773728 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.773667 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5hcrl" Apr 24 21:31:13.784390 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.784363 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/32477472-4713-476c-ac5d-4d2a735ad4b7-tmp\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.784471 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.784391 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32477472-4713-476c-ac5d-4d2a735ad4b7-serving-cert\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.784471 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.784408 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-stats-auth\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.784471 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.784423 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fv69\" (UniqueName: \"kubernetes.io/projected/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-kube-api-access-2fv69\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.784627 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.784574 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74fh7\" (UniqueName: \"kubernetes.io/projected/23da5cf6-4806-441a-8d04-9ddc0c84d07b-kube-api-access-74fh7\") pod \"kube-storage-version-migrator-operator-6769c5d45-f22xr\" (UID: \"23da5cf6-4806-441a-8d04-9ddc0c84d07b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" Apr 24 21:31:13.784627 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.784614 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88sl8\" (UniqueName: \"kubernetes.io/projected/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-kube-api-access-88sl8\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:13.784732 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.784649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32477472-4713-476c-ac5d-4d2a735ad4b7-service-ca-bundle\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.784732 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.784688 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-default-certificate\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: 
\"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.784732 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.784716 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n22fn\" (UniqueName: \"kubernetes.io/projected/32477472-4713-476c-ac5d-4d2a735ad4b7-kube-api-access-n22fn\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.784897 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.784762 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/32477472-4713-476c-ac5d-4d2a735ad4b7-snapshots\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.784897 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.784776 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/32477472-4713-476c-ac5d-4d2a735ad4b7-tmp\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.784897 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.784793 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.785077 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:13.785022 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle podName:68ab58ce-0982-4fa9-91b7-027cc0ef3bb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:14.285002678 +0000 UTC m=+158.826954272 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle") pod "router-default-6fb67cdcfb-x2njp" (UID: "68ab58ce-0982-4fa9-91b7-027cc0ef3bb7") : configmap references non-existent config key: service-ca.crt Apr 24 21:31:13.785169 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.785137 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32477472-4713-476c-ac5d-4d2a735ad4b7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.785327 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.785205 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:13.785327 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.785240 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.785327 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.785267 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23da5cf6-4806-441a-8d04-9ddc0c84d07b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-f22xr\" (UID: \"23da5cf6-4806-441a-8d04-9ddc0c84d07b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" Apr 24 21:31:13.785327 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.785268 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32477472-4713-476c-ac5d-4d2a735ad4b7-service-ca-bundle\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.785574 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.785335 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/32477472-4713-476c-ac5d-4d2a735ad4b7-snapshots\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.785574 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.785346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:13.785574 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.785406 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23da5cf6-4806-441a-8d04-9ddc0c84d07b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-f22xr\" (UID: \"23da5cf6-4806-441a-8d04-9ddc0c84d07b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" Apr 24 21:31:13.785723 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:13.785592 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:31:13.785723 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:13.785655 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls podName:51955aae-3c73-4c5a-9e8c-e93d7e1ed29d nodeName:}" failed. No retries permitted until 2026-04-24 21:31:14.285638535 +0000 UTC m=+158.827590130 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tjtvx" (UID: "51955aae-3c73-4c5a-9e8c-e93d7e1ed29d") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:31:13.785723 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:13.785717 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:31:13.785880 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:13.785751 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs podName:68ab58ce-0982-4fa9-91b7-027cc0ef3bb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:14.285739894 +0000 UTC m=+158.827691471 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs") pod "router-default-6fb67cdcfb-x2njp" (UID: "68ab58ce-0982-4fa9-91b7-027cc0ef3bb7") : secret "router-metrics-certs-default" not found Apr 24 21:31:13.786122 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.786029 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23da5cf6-4806-441a-8d04-9ddc0c84d07b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-f22xr\" (UID: \"23da5cf6-4806-441a-8d04-9ddc0c84d07b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" Apr 24 21:31:13.786741 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.786665 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32477472-4713-476c-ac5d-4d2a735ad4b7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.786850 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.786830 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:13.788065 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.788045 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32477472-4713-476c-ac5d-4d2a735ad4b7-serving-cert\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.788284 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.788248 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-default-certificate\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.788568 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.788548 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-stats-auth\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.789045 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.789028 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/23da5cf6-4806-441a-8d04-9ddc0c84d07b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-f22xr\" (UID: \"23da5cf6-4806-441a-8d04-9ddc0c84d07b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" Apr 24 21:31:13.796942 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.796922 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74fh7\" (UniqueName: 
\"kubernetes.io/projected/23da5cf6-4806-441a-8d04-9ddc0c84d07b-kube-api-access-74fh7\") pod \"kube-storage-version-migrator-operator-6769c5d45-f22xr\" (UID: \"23da5cf6-4806-441a-8d04-9ddc0c84d07b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" Apr 24 21:31:13.797307 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.797289 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88sl8\" (UniqueName: \"kubernetes.io/projected/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-kube-api-access-88sl8\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:13.797425 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.797405 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n22fn\" (UniqueName: \"kubernetes.io/projected/32477472-4713-476c-ac5d-4d2a735ad4b7-kube-api-access-n22fn\") pod \"insights-operator-585dfdc468-crpwz\" (UID: \"32477472-4713-476c-ac5d-4d2a735ad4b7\") " pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.797489 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.797472 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fv69\" (UniqueName: \"kubernetes.io/projected/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-kube-api-access-2fv69\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:13.877033 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.877002 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" Apr 24 21:31:13.888469 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.888449 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5hcrl"] Apr 24 21:31:13.890792 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:31:13.890763 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5acaffdc_65cb_4b11_a572_3f3d38f308c0.slice/crio-9c0a70e44299fd0114e5426209c7526240b49d02854679e09bb849a9643bf38f WatchSource:0}: Error finding container 9c0a70e44299fd0114e5426209c7526240b49d02854679e09bb849a9643bf38f: Status 404 returned error can't find the container with id 9c0a70e44299fd0114e5426209c7526240b49d02854679e09bb849a9643bf38f Apr 24 21:31:13.890889 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.890810 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-crpwz" Apr 24 21:31:13.995135 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:13.995110 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr"] Apr 24 21:31:13.999197 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:31:13.999167 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23da5cf6_4806_441a_8d04_9ddc0c84d07b.slice/crio-bbf3a9c17e4f3667e08a60603cd0a52e1d0445b83b455e12daa848834e2f4699 WatchSource:0}: Error finding container bbf3a9c17e4f3667e08a60603cd0a52e1d0445b83b455e12daa848834e2f4699: Status 404 returned error can't find the container with id bbf3a9c17e4f3667e08a60603cd0a52e1d0445b83b455e12daa848834e2f4699 Apr 24 21:31:14.006958 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:14.006935 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-crpwz"] Apr 24 21:31:14.010644 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:31:14.010623 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32477472_4713_476c_ac5d_4d2a735ad4b7.slice/crio-9454ca6053ba4696df1a0954961d75b663343e100e0a25a8b6305624558deb59 WatchSource:0}: Error finding container 9454ca6053ba4696df1a0954961d75b663343e100e0a25a8b6305624558deb59: Status 404 returned error can't find the container with id 9454ca6053ba4696df1a0954961d75b663343e100e0a25a8b6305624558deb59 Apr 24 21:31:14.289649 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:14.289618 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:14.289797 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:14.289671 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:14.289797 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:14.289755 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:31:14.289797 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:14.289777 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle podName:68ab58ce-0982-4fa9-91b7-027cc0ef3bb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:15.289760657 +0000 UTC m=+159.831712238 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle") pod "router-default-6fb67cdcfb-x2njp" (UID: "68ab58ce-0982-4fa9-91b7-027cc0ef3bb7") : configmap references non-existent config key: service-ca.crt Apr 24 21:31:14.289953 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:14.289807 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:14.289953 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:14.289885 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls podName:51955aae-3c73-4c5a-9e8c-e93d7e1ed29d nodeName:}" failed. No retries permitted until 2026-04-24 21:31:15.289877977 +0000 UTC m=+159.831829554 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tjtvx" (UID: "51955aae-3c73-4c5a-9e8c-e93d7e1ed29d") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:31:14.289953 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:14.289917 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:31:14.290064 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:14.289974 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs podName:68ab58ce-0982-4fa9-91b7-027cc0ef3bb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:15.289957412 +0000 UTC m=+159.831909007 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs") pod "router-default-6fb67cdcfb-x2njp" (UID: "68ab58ce-0982-4fa9-91b7-027cc0ef3bb7") : secret "router-metrics-certs-default" not found Apr 24 21:31:14.358255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:14.358226 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" event={"ID":"23da5cf6-4806-441a-8d04-9ddc0c84d07b","Type":"ContainerStarted","Data":"bbf3a9c17e4f3667e08a60603cd0a52e1d0445b83b455e12daa848834e2f4699"} Apr 24 21:31:14.359194 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:14.359171 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-crpwz" event={"ID":"32477472-4713-476c-ac5d-4d2a735ad4b7","Type":"ContainerStarted","Data":"9454ca6053ba4696df1a0954961d75b663343e100e0a25a8b6305624558deb59"} Apr 24 21:31:14.359968 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:14.359946 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5hcrl" event={"ID":"5acaffdc-65cb-4b11-a572-3f3d38f308c0","Type":"ContainerStarted","Data":"9c0a70e44299fd0114e5426209c7526240b49d02854679e09bb849a9643bf38f"} Apr 24 21:31:15.300957 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:15.300779 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:15.300957 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:15.300886 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:15.300957 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:15.300925 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle podName:68ab58ce-0982-4fa9-91b7-027cc0ef3bb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:17.300902377 +0000 UTC m=+161.842853967 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle") pod "router-default-6fb67cdcfb-x2njp" (UID: "68ab58ce-0982-4fa9-91b7-027cc0ef3bb7") : configmap references non-existent config key: service-ca.crt Apr 24 21:31:15.301560 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:15.300985 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:31:15.301560 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:15.301039 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls podName:51955aae-3c73-4c5a-9e8c-e93d7e1ed29d nodeName:}" failed. 
No retries permitted until 2026-04-24 21:31:17.301023266 +0000 UTC m=+161.842974847 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tjtvx" (UID: "51955aae-3c73-4c5a-9e8c-e93d7e1ed29d") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:31:15.301560 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:15.300982 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:15.301560 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:15.301055 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:31:15.301560 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:15.301101 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs podName:68ab58ce-0982-4fa9-91b7-027cc0ef3bb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:17.301086452 +0000 UTC m=+161.843038034 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs") pod "router-default-6fb67cdcfb-x2njp" (UID: "68ab58ce-0982-4fa9-91b7-027cc0ef3bb7") : secret "router-metrics-certs-default" not found Apr 24 21:31:16.106914 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:16.106878 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:31:16.107094 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:16.106998 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls\") pod \"image-registry-8fbf6b564-dm5w4\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:31:16.107094 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:16.107001 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:31:16.107202 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:16.107122 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert podName:6b55e3f0-b53c-4afe-88de-6b0ada988fc9 nodeName:}" failed. No retries permitted until 2026-04-24 21:33:18.10710126 +0000 UTC m=+282.649052846 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f8hxj" (UID: "6b55e3f0-b53c-4afe-88de-6b0ada988fc9") : secret "networking-console-plugin-cert" not found Apr 24 21:31:16.107202 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:16.107058 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:31:16.107202 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:16.107144 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8fbf6b564-dm5w4: secret "image-registry-tls" not found Apr 24 21:31:16.107202 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:16.107190 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls podName:e10024bb-2bc4-4af2-987b-a3c774bd83a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:33:18.107176722 +0000 UTC m=+282.649128303 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls") pod "image-registry-8fbf6b564-dm5w4" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1") : secret "image-registry-tls" not found Apr 24 21:31:16.207427 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:16.207347 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:31:16.207427 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:16.207387 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:31:16.207659 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:16.207487 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:31:16.207659 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:16.207486 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:31:16.207659 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:16.207561 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert podName:3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f nodeName:}" failed. No retries permitted until 2026-04-24 21:33:18.207543282 +0000 UTC m=+282.749494862 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert") pod "ingress-canary-rrff5" (UID: "3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f") : secret "canary-serving-cert" not found Apr 24 21:31:16.207659 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:16.207603 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls podName:6a5a230a-7dee-4c4a-882c-d0cb1e017e43 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:33:18.207585058 +0000 UTC m=+282.749536642 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls") pod "dns-default-5tp2b" (UID: "6a5a230a-7dee-4c4a-882c-d0cb1e017e43") : secret "dns-default-metrics-tls" not found Apr 24 21:31:17.317251 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.317216 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:17.317760 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.317268 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:17.317760 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.317290 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:17.317760 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:17.317457 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:31:17.317760 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:17.317484 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle podName:68ab58ce-0982-4fa9-91b7-027cc0ef3bb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:21.317463802 +0000 UTC m=+165.859415380 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle") pod "router-default-6fb67cdcfb-x2njp" (UID: "68ab58ce-0982-4fa9-91b7-027cc0ef3bb7") : configmap references non-existent config key: service-ca.crt Apr 24 21:31:17.317760 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:17.317511 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs podName:68ab58ce-0982-4fa9-91b7-027cc0ef3bb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:21.317501516 +0000 UTC m=+165.859453105 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs") pod "router-default-6fb67cdcfb-x2njp" (UID: "68ab58ce-0982-4fa9-91b7-027cc0ef3bb7") : secret "router-metrics-certs-default" not found Apr 24 21:31:17.317760 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:17.317462 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:31:17.317760 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:17.317561 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls podName:51955aae-3c73-4c5a-9e8c-e93d7e1ed29d nodeName:}" failed. No retries permitted until 2026-04-24 21:31:21.31754898 +0000 UTC m=+165.859500560 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tjtvx" (UID: "51955aae-3c73-4c5a-9e8c-e93d7e1ed29d") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:31:17.368164 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.368130 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" event={"ID":"23da5cf6-4806-441a-8d04-9ddc0c84d07b","Type":"ContainerStarted","Data":"090124b397771b8a5c2e6d0207d9a9f4eceb3eb11e4c4bc8508ced3bd6d9fb5a"} Apr 24 21:31:17.369489 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.369459 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-crpwz" event={"ID":"32477472-4713-476c-ac5d-4d2a735ad4b7","Type":"ContainerStarted","Data":"48b2458b3158ec82c8b2f168a11e30c39bfb6d8aa7f2e81d5434244b49192f7a"} Apr 24 21:31:17.370701 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.370672 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5hcrl" event={"ID":"5acaffdc-65cb-4b11-a572-3f3d38f308c0","Type":"ContainerStarted","Data":"3413f3884d5ddf16b6b119447d99c26e79f7b586b0336af1c37b9cf5fb803998"} Apr 24 21:31:17.386167 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.386122 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" podStartSLOduration=1.971199866 podStartE2EDuration="4.386109073s" podCreationTimestamp="2026-04-24 21:31:13 +0000 UTC" firstStartedPulling="2026-04-24 21:31:14.001809284 +0000 UTC m=+158.543760861" lastFinishedPulling="2026-04-24 21:31:16.416718474 +0000 UTC m=+160.958670068" observedRunningTime="2026-04-24 21:31:17.38542663 +0000 UTC m=+161.927378231" watchObservedRunningTime="2026-04-24 21:31:17.386109073 +0000 UTC m=+161.928060696" Apr 24 21:31:17.407705 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.407666 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-crpwz" podStartSLOduration=2.007093311 podStartE2EDuration="4.407654036s" podCreationTimestamp="2026-04-24 21:31:13 +0000 UTC" firstStartedPulling="2026-04-24 21:31:14.012382314 +0000 UTC m=+158.554333894" lastFinishedPulling="2026-04-24 21:31:16.412943042 +0000 UTC 
m=+160.954894619" observedRunningTime="2026-04-24 21:31:17.40680962 +0000 UTC m=+161.948761220" watchObservedRunningTime="2026-04-24 21:31:17.407654036 +0000 UTC m=+161.949605634" Apr 24 21:31:17.427739 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.427688 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5hcrl" podStartSLOduration=1.913291612 podStartE2EDuration="4.427673167s" podCreationTimestamp="2026-04-24 21:31:13 +0000 UTC" firstStartedPulling="2026-04-24 21:31:13.892420867 +0000 UTC m=+158.434372444" lastFinishedPulling="2026-04-24 21:31:16.406802413 +0000 UTC m=+160.948753999" observedRunningTime="2026-04-24 21:31:17.426407901 +0000 UTC m=+161.968359500" watchObservedRunningTime="2026-04-24 21:31:17.427673167 +0000 UTC m=+161.969624767" Apr 24 21:31:17.657314 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.657286 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-v2n27"] Apr 24 21:31:17.660321 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.660305 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v2n27" Apr 24 21:31:17.664111 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.663899 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 21:31:17.664111 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.663973 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-qxsrn\"" Apr 24 21:31:17.664111 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.664081 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 21:31:17.671861 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.671841 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-v2n27"] Apr 24 21:31:17.821535 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.821495 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv467\" (UniqueName: \"kubernetes.io/projected/090e54a8-26ef-4bf1-ae67-483015286e44-kube-api-access-gv467\") pod \"migrator-74bb7799d9-v2n27\" (UID: \"090e54a8-26ef-4bf1-ae67-483015286e44\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v2n27" Apr 24 21:31:17.922034 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.921972 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gv467\" (UniqueName: \"kubernetes.io/projected/090e54a8-26ef-4bf1-ae67-483015286e44-kube-api-access-gv467\") pod \"migrator-74bb7799d9-v2n27\" (UID: \"090e54a8-26ef-4bf1-ae67-483015286e44\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v2n27" Apr 24 21:31:17.929961 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.929935 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv467\" (UniqueName: \"kubernetes.io/projected/090e54a8-26ef-4bf1-ae67-483015286e44-kube-api-access-gv467\") pod \"migrator-74bb7799d9-v2n27\" (UID: \"090e54a8-26ef-4bf1-ae67-483015286e44\") " 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v2n27" Apr 24 21:31:17.970103 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:17.970083 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v2n27" Apr 24 21:31:18.085876 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:18.085848 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-v2n27"] Apr 24 21:31:18.089061 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:31:18.089024 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod090e54a8_26ef_4bf1_ae67_483015286e44.slice/crio-3499e5a998ad84e3999539a8f28b1d143cd3c58aeefe6d54a44dd56aea261dbc WatchSource:0}: Error finding container 3499e5a998ad84e3999539a8f28b1d143cd3c58aeefe6d54a44dd56aea261dbc: Status 404 returned error can't find the container with id 3499e5a998ad84e3999539a8f28b1d143cd3c58aeefe6d54a44dd56aea261dbc Apr 24 21:31:18.377720 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:18.377676 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v2n27" event={"ID":"090e54a8-26ef-4bf1-ae67-483015286e44","Type":"ContainerStarted","Data":"3499e5a998ad84e3999539a8f28b1d143cd3c58aeefe6d54a44dd56aea261dbc"} Apr 24 21:31:19.382485 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:19.382456 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v2n27" event={"ID":"090e54a8-26ef-4bf1-ae67-483015286e44","Type":"ContainerStarted","Data":"734c01865c720f3246bfb42c97b2f8b78754d3a0e9b4b40af045a0e5556b5020"} Apr 24 21:31:19.382782 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:19.382493 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v2n27" event={"ID":"090e54a8-26ef-4bf1-ae67-483015286e44","Type":"ContainerStarted","Data":"c5e3a5e48b3a489d09cadf4ed0b7af634c7d75c1720a1290843b566449164a6b"} Apr 24 21:31:19.398621 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:19.398583 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-v2n27" podStartSLOduration=1.289850339 podStartE2EDuration="2.398572261s" podCreationTimestamp="2026-04-24 21:31:17 +0000 UTC" firstStartedPulling="2026-04-24 21:31:18.091433307 +0000 UTC m=+162.633384893" lastFinishedPulling="2026-04-24 21:31:19.200155221 +0000 UTC m=+163.742106815" observedRunningTime="2026-04-24 21:31:19.39765954 +0000 UTC m=+163.939611139" watchObservedRunningTime="2026-04-24 21:31:19.398572261 +0000 UTC m=+163.940523859" Apr 24 21:31:21.239573 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:21.239544 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xdhfk_f3c0ee91-65bf-4825-9def-c62c4806cc59/dns-node-resolver/0.log" Apr 24 21:31:21.350867 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:21.350836 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:21.351004 ip-10-0-134-249 kubenswrapper[2567]: 
I0424 21:31:21.350884 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:21.351004 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:21.350910 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:21.351090 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:21.351000 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:31:21.351090 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:21.351009 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:31:21.351090 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:21.351055 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls podName:51955aae-3c73-4c5a-9e8c-e93d7e1ed29d nodeName:}" failed. No retries permitted until 2026-04-24 21:31:29.351040652 +0000 UTC m=+173.892992229 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tjtvx" (UID: "51955aae-3c73-4c5a-9e8c-e93d7e1ed29d") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:31:21.351090 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:21.351070 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs podName:68ab58ce-0982-4fa9-91b7-027cc0ef3bb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:29.351063608 +0000 UTC m=+173.893015184 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs") pod "router-default-6fb67cdcfb-x2njp" (UID: "68ab58ce-0982-4fa9-91b7-027cc0ef3bb7") : secret "router-metrics-certs-default" not found Apr 24 21:31:21.351227 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:21.351104 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle podName:68ab58ce-0982-4fa9-91b7-027cc0ef3bb7 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:29.351091696 +0000 UTC m=+173.893043278 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle") pod "router-default-6fb67cdcfb-x2njp" (UID: "68ab58ce-0982-4fa9-91b7-027cc0ef3bb7") : configmap references non-existent config key: service-ca.crt Apr 24 21:31:22.242891 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:22.242862 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qj6pk_58290685-bbba-48e3-936e-34b4f4d27034/node-ca/0.log" Apr 24 21:31:23.239939 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:23.239911 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-v2n27_090e54a8-26ef-4bf1-ae67-483015286e44/migrator/0.log" Apr 24 21:31:23.439235 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:23.439206 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-v2n27_090e54a8-26ef-4bf1-ae67-483015286e44/graceful-termination/0.log" Apr 24 21:31:23.644457 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:23.644431 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-f22xr_23da5cf6-4806-441a-8d04-9ddc0c84d07b/kube-storage-version-migrator-operator/0.log" Apr 24 21:31:24.977704 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:24.977620 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:31:29.414985 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:29.414953 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:29.415409 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:29.415002 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:29.415409 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:29.415024 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:29.415409 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:29.415170 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:31:29.415409 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:29.415249 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls podName:51955aae-3c73-4c5a-9e8c-e93d7e1ed29d nodeName:}" failed. 
No retries permitted until 2026-04-24 21:31:45.415226708 +0000 UTC m=+189.957178287 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tjtvx" (UID: "51955aae-3c73-4c5a-9e8c-e93d7e1ed29d") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:31:29.415649 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:29.415507 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-service-ca-bundle\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:29.417169 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:29.417152 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/68ab58ce-0982-4fa9-91b7-027cc0ef3bb7-metrics-certs\") pod \"router-default-6fb67cdcfb-x2njp\" (UID: \"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7\") " pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:29.497222 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:29.497197 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:29.611340 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:29.611302 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6fb67cdcfb-x2njp"] Apr 24 21:31:29.614546 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:31:29.614499 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68ab58ce_0982_4fa9_91b7_027cc0ef3bb7.slice/crio-65f2f63a6da44fc98110bcc9c1e140e4f9960306e74e039a8a8722b9d8d617f3 WatchSource:0}: Error finding container 65f2f63a6da44fc98110bcc9c1e140e4f9960306e74e039a8a8722b9d8d617f3: Status 404 returned error can't find the container with id 65f2f63a6da44fc98110bcc9c1e140e4f9960306e74e039a8a8722b9d8d617f3 Apr 24 21:31:30.411821 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:30.411782 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" event={"ID":"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7","Type":"ContainerStarted","Data":"cbf33f83f7fea0a72dad0c3ca16643a37cdf82afeecf449677a9c15067668af1"} Apr 24 21:31:30.411821 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:30.411821 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" event={"ID":"68ab58ce-0982-4fa9-91b7-027cc0ef3bb7","Type":"ContainerStarted","Data":"65f2f63a6da44fc98110bcc9c1e140e4f9960306e74e039a8a8722b9d8d617f3"} Apr 24 21:31:30.434944 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:30.434900 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" podStartSLOduration=17.434886892 podStartE2EDuration="17.434886892s" podCreationTimestamp="2026-04-24 21:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:31:30.43350954 +0000 UTC m=+174.975461154" watchObservedRunningTime="2026-04-24 21:31:30.434886892 +0000 UTC m=+174.976838491" Apr 24 21:31:30.497306 
ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:30.497278 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:30.499853 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:30.499828 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:31.414539 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:31.414495 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:31.415715 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:31.415696 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6fb67cdcfb-x2njp" Apr 24 21:31:45.422212 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:45.422173 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:45.424689 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:45.424660 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/51955aae-3c73-4c5a-9e8c-e93d7e1ed29d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tjtvx\" (UID: \"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:45.687287 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:45.687200 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-xsb94\"" Apr 24 21:31:45.694824 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:45.694797 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" Apr 24 21:31:45.805381 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:45.805353 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx"] Apr 24 21:31:45.808974 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:31:45.808941 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51955aae_3c73_4c5a_9e8c_e93d7e1ed29d.slice/crio-1e6fbf5e2b9b545a3c4308fb06ed36e7c3573bf73afc088e52a028ad2c9a7882 WatchSource:0}: Error finding container 1e6fbf5e2b9b545a3c4308fb06ed36e7c3573bf73afc088e52a028ad2c9a7882: Status 404 returned error can't find the container with id 1e6fbf5e2b9b545a3c4308fb06ed36e7c3573bf73afc088e52a028ad2c9a7882 Apr 24 21:31:46.456201 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.456159 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" event={"ID":"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d","Type":"ContainerStarted","Data":"1e6fbf5e2b9b545a3c4308fb06ed36e7c3573bf73afc088e52a028ad2c9a7882"} Apr 24 21:31:46.492214 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.492185 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-ftrc6"] Apr 24 21:31:46.494448 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.494424 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.498359 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.498329 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:31:46.499123 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.499101 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-r5h69\"" Apr 24 21:31:46.499330 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.499316 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:31:46.508865 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.508837 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ftrc6"] Apr 24 21:31:46.531722 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.531693 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/29129b49-ae5a-46a2-b952-958acbcd5d52-data-volume\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.531829 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.531737 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/29129b49-ae5a-46a2-b952-958acbcd5d52-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.531899 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.531830 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rcmd\" (UniqueName: \"kubernetes.io/projected/29129b49-ae5a-46a2-b952-958acbcd5d52-kube-api-access-2rcmd\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.531953 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.531922 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/29129b49-ae5a-46a2-b952-958acbcd5d52-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.532007 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.531991 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/29129b49-ae5a-46a2-b952-958acbcd5d52-crio-socket\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.632763 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.632733 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/29129b49-ae5a-46a2-b952-958acbcd5d52-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.632926 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.632810 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/29129b49-ae5a-46a2-b952-958acbcd5d52-crio-socket\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.632926 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.632920 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/29129b49-ae5a-46a2-b952-958acbcd5d52-data-volume\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.633042 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.632952 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/29129b49-ae5a-46a2-b952-958acbcd5d52-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.633042 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.632998 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rcmd\" (UniqueName: \"kubernetes.io/projected/29129b49-ae5a-46a2-b952-958acbcd5d52-kube-api-access-2rcmd\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.633268 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.633243 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/29129b49-ae5a-46a2-b952-958acbcd5d52-data-volume\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.633362 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.632949 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/29129b49-ae5a-46a2-b952-958acbcd5d52-crio-socket\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.633425 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.633398 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/29129b49-ae5a-46a2-b952-958acbcd5d52-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.636497 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.636476 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/29129b49-ae5a-46a2-b952-958acbcd5d52-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.642366 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.642332 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rcmd\" (UniqueName: \"kubernetes.io/projected/29129b49-ae5a-46a2-b952-958acbcd5d52-kube-api-access-2rcmd\") pod \"insights-runtime-extractor-ftrc6\" (UID: \"29129b49-ae5a-46a2-b952-958acbcd5d52\") " pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.806339 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.806264 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ftrc6" Apr 24 21:31:46.943882 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:46.943843 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ftrc6"] Apr 24 21:31:47.308909 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:31:47.308875 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29129b49_ae5a_46a2_b952_958acbcd5d52.slice/crio-87e95fcfefe4fdbd6f53ebebe97de80309efb41a59bda62f538271a265832c0b WatchSource:0}: Error finding container 87e95fcfefe4fdbd6f53ebebe97de80309efb41a59bda62f538271a265832c0b: Status 404 returned error can't find the container with id 87e95fcfefe4fdbd6f53ebebe97de80309efb41a59bda62f538271a265832c0b Apr 24 21:31:47.460943 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:47.460891 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" event={"ID":"51955aae-3c73-4c5a-9e8c-e93d7e1ed29d","Type":"ContainerStarted","Data":"71d21c72491b057e3abddec4b1f33266123002795f0ad0b84cba73b20a19a976"} Apr 24 21:31:47.462412 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:47.462371 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ftrc6" event={"ID":"29129b49-ae5a-46a2-b952-958acbcd5d52","Type":"ContainerStarted","Data":"fc0a31846a6a02ac807d7dfb0edd4972a9a37f24570307298c36150c5159a268"} Apr 24 21:31:47.462412 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:47.462410 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ftrc6" event={"ID":"29129b49-ae5a-46a2-b952-958acbcd5d52","Type":"ContainerStarted","Data":"87e95fcfefe4fdbd6f53ebebe97de80309efb41a59bda62f538271a265832c0b"} Apr 24 21:31:47.478025 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:47.477975 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tjtvx" podStartSLOduration=32.952744906 podStartE2EDuration="34.477958708s" podCreationTimestamp="2026-04-24 21:31:13 +0000 UTC" firstStartedPulling="2026-04-24 21:31:45.810714207 +0000 UTC m=+190.352665785" lastFinishedPulling="2026-04-24 21:31:47.335927996 +0000 UTC m=+191.877879587" observedRunningTime="2026-04-24 21:31:47.476551123 +0000 UTC m=+192.018502723" watchObservedRunningTime="2026-04-24 21:31:47.477958708 +0000 UTC m=+192.019910309" Apr 24 21:31:47.822669 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:47.822642 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq"] Apr 24 21:31:47.824490 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:47.824474 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq" Apr 24 21:31:47.826630 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:47.826584 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 21:31:47.826748 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:47.826600 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-qmzb7\"" Apr 24 21:31:47.833192 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:47.833175 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq"] Apr 24 21:31:47.943469 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:47.943444 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/35b673bf-f435-46d8-a296-c4719d80ee9d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gmklq\" (UID: \"35b673bf-f435-46d8-a296-c4719d80ee9d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq" Apr 24 21:31:48.044785 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:48.044757 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/35b673bf-f435-46d8-a296-c4719d80ee9d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gmklq\" (UID: \"35b673bf-f435-46d8-a296-c4719d80ee9d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq" Apr 24 21:31:48.044915 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:48.044897 2567 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 24 21:31:48.044971 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:48.044960 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35b673bf-f435-46d8-a296-c4719d80ee9d-tls-certificates podName:35b673bf-f435-46d8-a296-c4719d80ee9d nodeName:}" failed. No retries permitted until 2026-04-24 21:31:48.54494159 +0000 UTC m=+193.086893168 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/35b673bf-f435-46d8-a296-c4719d80ee9d-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-gmklq" (UID: "35b673bf-f435-46d8-a296-c4719d80ee9d") : secret "prometheus-operator-admission-webhook-tls" not found Apr 24 21:31:48.466909 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:48.466859 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ftrc6" event={"ID":"29129b49-ae5a-46a2-b952-958acbcd5d52","Type":"ContainerStarted","Data":"51bdde078f9daef847e562d6b7236e7ba678f41cfabef60a74c0846363a20e18"} Apr 24 21:31:48.549840 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:48.549801 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/35b673bf-f435-46d8-a296-c4719d80ee9d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gmklq\" (UID: \"35b673bf-f435-46d8-a296-c4719d80ee9d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq" Apr 24 21:31:48.552565 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:48.552539 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/35b673bf-f435-46d8-a296-c4719d80ee9d-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-gmklq\" (UID: \"35b673bf-f435-46d8-a296-c4719d80ee9d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq" Apr 24 21:31:48.734564 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:48.734476 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq" Apr 24 21:31:49.171151 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:49.171129 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq"] Apr 24 21:31:49.174007 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:31:49.173985 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35b673bf_f435_46d8_a296_c4719d80ee9d.slice/crio-ae4fcaf3892efc829e1dd30b14009477a8b2ebbdd23c77fb3434bac18344dd34 WatchSource:0}: Error finding container ae4fcaf3892efc829e1dd30b14009477a8b2ebbdd23c77fb3434bac18344dd34: Status 404 returned error can't find the container with id ae4fcaf3892efc829e1dd30b14009477a8b2ebbdd23c77fb3434bac18344dd34 Apr 24 21:31:49.470661 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:49.470631 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ftrc6" event={"ID":"29129b49-ae5a-46a2-b952-958acbcd5d52","Type":"ContainerStarted","Data":"7e683a614aa000b190d822a3037ddfbe97bb96c980fded7c717bf328c4064da2"} Apr 24 21:31:49.471607 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:49.471586 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq" event={"ID":"35b673bf-f435-46d8-a296-c4719d80ee9d","Type":"ContainerStarted","Data":"ae4fcaf3892efc829e1dd30b14009477a8b2ebbdd23c77fb3434bac18344dd34"} Apr 24 21:31:49.490510 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:49.490466 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ftrc6" 
podStartSLOduration=1.766060161 podStartE2EDuration="3.490453181s" podCreationTimestamp="2026-04-24 21:31:46 +0000 UTC" firstStartedPulling="2026-04-24 21:31:47.371657286 +0000 UTC m=+191.913608878" lastFinishedPulling="2026-04-24 21:31:49.096050312 +0000 UTC m=+193.638001898" observedRunningTime="2026-04-24 21:31:49.488980412 +0000 UTC m=+194.030932011" watchObservedRunningTime="2026-04-24 21:31:49.490453181 +0000 UTC m=+194.032404782" Apr 24 21:31:50.477243 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.477134 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq" event={"ID":"35b673bf-f435-46d8-a296-c4719d80ee9d","Type":"ContainerStarted","Data":"283a37a78bb577cd9e880f244c41b95ece5ca255949ff510e40089463efda4c8"} Apr 24 21:31:50.477668 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.477457 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq" Apr 24 21:31:50.482264 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.482233 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq" Apr 24 21:31:50.495636 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.495591 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-gmklq" podStartSLOduration=2.459348387 podStartE2EDuration="3.495578556s" podCreationTimestamp="2026-04-24 21:31:47 +0000 UTC" firstStartedPulling="2026-04-24 21:31:49.175785859 +0000 UTC m=+193.717737436" lastFinishedPulling="2026-04-24 21:31:50.212016011 +0000 UTC m=+194.753967605" observedRunningTime="2026-04-24 21:31:50.494626859 +0000 UTC m=+195.036578469" watchObservedRunningTime="2026-04-24 21:31:50.495578556 +0000 UTC m=+195.037530154" Apr 24 21:31:50.917379 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.917348 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v8qkr"] Apr 24 21:31:50.919613 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.919590 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:50.923201 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.923183 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 21:31:50.923341 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.923324 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:31:50.924021 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.924005 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 21:31:50.924076 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.924049 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-hvfhr\"" Apr 24 21:31:50.942104 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.942076 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v8qkr"] Apr 24 21:31:50.968502 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.968477 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/700d16c7-e5f3-4120-acf4-338514077b97-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v8qkr\" (UID: \"700d16c7-e5f3-4120-acf4-338514077b97\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:50.968611 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.968515 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/700d16c7-e5f3-4120-acf4-338514077b97-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v8qkr\" (UID: \"700d16c7-e5f3-4120-acf4-338514077b97\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:50.968611 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.968592 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54nzp\" (UniqueName: \"kubernetes.io/projected/700d16c7-e5f3-4120-acf4-338514077b97-kube-api-access-54nzp\") pod \"prometheus-operator-5676c8c784-v8qkr\" (UID: \"700d16c7-e5f3-4120-acf4-338514077b97\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:50.968699 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:50.968634 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/700d16c7-e5f3-4120-acf4-338514077b97-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v8qkr\" (UID: \"700d16c7-e5f3-4120-acf4-338514077b97\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:51.069781 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:51.069754 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/700d16c7-e5f3-4120-acf4-338514077b97-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v8qkr\" (UID: \"700d16c7-e5f3-4120-acf4-338514077b97\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:51.069906 
ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:51.069790 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54nzp\" (UniqueName: \"kubernetes.io/projected/700d16c7-e5f3-4120-acf4-338514077b97-kube-api-access-54nzp\") pod \"prometheus-operator-5676c8c784-v8qkr\" (UID: \"700d16c7-e5f3-4120-acf4-338514077b97\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:51.069906 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:51.069828 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/700d16c7-e5f3-4120-acf4-338514077b97-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v8qkr\" (UID: \"700d16c7-e5f3-4120-acf4-338514077b97\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:51.069977 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:51.069922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/700d16c7-e5f3-4120-acf4-338514077b97-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v8qkr\" (UID: \"700d16c7-e5f3-4120-acf4-338514077b97\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:51.070359 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:51.070338 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/700d16c7-e5f3-4120-acf4-338514077b97-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-v8qkr\" (UID: \"700d16c7-e5f3-4120-acf4-338514077b97\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:51.072170 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:51.072146 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/700d16c7-e5f3-4120-acf4-338514077b97-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-v8qkr\" (UID: \"700d16c7-e5f3-4120-acf4-338514077b97\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:51.072348 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:51.072326 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/700d16c7-e5f3-4120-acf4-338514077b97-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-v8qkr\" (UID: \"700d16c7-e5f3-4120-acf4-338514077b97\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:51.079936 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:51.079918 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54nzp\" (UniqueName: \"kubernetes.io/projected/700d16c7-e5f3-4120-acf4-338514077b97-kube-api-access-54nzp\") pod \"prometheus-operator-5676c8c784-v8qkr\" (UID: \"700d16c7-e5f3-4120-acf4-338514077b97\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:51.228565 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:51.228506 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" Apr 24 21:31:51.337490 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:51.337458 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-v8qkr"] Apr 24 21:31:51.340229 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:31:51.340204 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod700d16c7_e5f3_4120_acf4_338514077b97.slice/crio-6ecfbbc0d2c9f6455287ec6103359ef045dc419f772595bfadcfa8ce4364ecfd WatchSource:0}: Error finding container 6ecfbbc0d2c9f6455287ec6103359ef045dc419f772595bfadcfa8ce4364ecfd: Status 404 returned error can't find the container with id 6ecfbbc0d2c9f6455287ec6103359ef045dc419f772595bfadcfa8ce4364ecfd Apr 24 21:31:51.480433 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:51.480376 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" event={"ID":"700d16c7-e5f3-4120-acf4-338514077b97","Type":"ContainerStarted","Data":"6ecfbbc0d2c9f6455287ec6103359ef045dc419f772595bfadcfa8ce4364ecfd"} Apr 24 21:31:53.486840 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:53.486804 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" event={"ID":"700d16c7-e5f3-4120-acf4-338514077b97","Type":"ContainerStarted","Data":"d54064db467f2f5f00e93a1ab7b03b285c7a73b5a5388a218f9f232b5a61b177"} Apr 24 21:31:53.486840 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:53.486839 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" event={"ID":"700d16c7-e5f3-4120-acf4-338514077b97","Type":"ContainerStarted","Data":"65b0f4c4a5ac516c1b681d801ef55bec0cda84fcb51dc69fe1f5148ce07908e7"} Apr 24 21:31:53.511002 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:53.510951 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-v8qkr" podStartSLOduration=2.273064147 podStartE2EDuration="3.510939066s" podCreationTimestamp="2026-04-24 21:31:50 +0000 UTC" firstStartedPulling="2026-04-24 21:31:51.342458753 +0000 UTC m=+195.884410330" lastFinishedPulling="2026-04-24 21:31:52.580333655 +0000 UTC m=+197.122285249" observedRunningTime="2026-04-24 21:31:53.509603217 +0000 UTC m=+198.051554831" watchObservedRunningTime="2026-04-24 21:31:53.510939066 +0000 UTC m=+198.052890665" Apr 24 21:31:55.280828 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.280795 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2bmn2"] Apr 24 21:31:55.283336 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.283314 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.286142 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.286123 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-zmm79\"" Apr 24 21:31:55.286338 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.286325 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:31:55.286594 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.286570 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:31:55.286804 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.286780 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:31:55.289204 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.289185 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-ws6r5"] Apr 24 21:31:55.292199 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.292180 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.296852 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.296833 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 21:31:55.296961 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.296865 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-8rjws\"" Apr 24 21:31:55.296961 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.296871 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 21:31:55.296961 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.296943 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 21:31:55.305565 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.305543 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-ws6r5"] Apr 24 21:31:55.406890 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.406867 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f029b748-78df-447d-8158-9a6cca578bb4-root\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.407012 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.406917 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-textfile\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.407012 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.406939 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" 
(UniqueName: \"kubernetes.io/empty-dir/8ab5b171-1976-4834-9bdc-b77f8f597ceb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.407012 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.406979 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2mzt\" (UniqueName: \"kubernetes.io/projected/f029b748-78df-447d-8158-9a6cca578bb4-kube-api-access-p2mzt\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.407140 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.407015 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ab5b171-1976-4834-9bdc-b77f8f597ceb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.407140 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.407033 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-tls\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.407140 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.407049 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.407140 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.407067 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-wtmp\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.407140 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.407086 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.407140 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.407121 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn84h\" (UniqueName: \"kubernetes.io/projected/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-api-access-xn84h\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.407357 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.407156 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f029b748-78df-447d-8158-9a6cca578bb4-sys\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.407357 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.407184 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.407357 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.407199 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.407357 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.407218 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-accelerators-collector-config\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.407357 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.407238 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f029b748-78df-447d-8158-9a6cca578bb4-metrics-client-ca\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.508529 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508507 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.508651 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508552 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.508651 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508571 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-accelerators-collector-config\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " 
pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.508651 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508590 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f029b748-78df-447d-8158-9a6cca578bb4-metrics-client-ca\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.508651 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508629 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f029b748-78df-447d-8158-9a6cca578bb4-root\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.508899 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:55.508647 2567 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 24 21:31:55.508899 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508666 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-textfile\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.508899 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508689 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8ab5b171-1976-4834-9bdc-b77f8f597ceb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.508899 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:31:55.508721 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-state-metrics-tls podName:8ab5b171-1976-4834-9bdc-b77f8f597ceb nodeName:}" failed. No retries permitted until 2026-04-24 21:31:56.008702409 +0000 UTC m=+200.550654003 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-ws6r5" (UID: "8ab5b171-1976-4834-9bdc-b77f8f597ceb") : secret "kube-state-metrics-tls" not found Apr 24 21:31:55.508899 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508726 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f029b748-78df-447d-8158-9a6cca578bb4-root\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.508899 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508774 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2mzt\" (UniqueName: \"kubernetes.io/projected/f029b748-78df-447d-8158-9a6cca578bb4-kube-api-access-p2mzt\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.508899 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508811 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ab5b171-1976-4834-9bdc-b77f8f597ceb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.508899 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508835 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-tls\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.508899 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508865 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.508899 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508904 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-wtmp\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.509396 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508933 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.509396 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508974 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xn84h\" (UniqueName: \"kubernetes.io/projected/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-api-access-xn84h\") 
pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.509396 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.508999 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8ab5b171-1976-4834-9bdc-b77f8f597ceb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.509396 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.509034 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f029b748-78df-447d-8158-9a6cca578bb4-sys\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.509396 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.509057 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-textfile\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.509396 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.509134 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f029b748-78df-447d-8158-9a6cca578bb4-sys\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.509396 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.509208 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-wtmp\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.509396 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.509296 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.509396 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.509308 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-accelerators-collector-config\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.509815 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.509590 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ab5b171-1976-4834-9bdc-b77f8f597ceb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 
21:31:55.509815 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.509761 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f029b748-78df-447d-8158-9a6cca578bb4-metrics-client-ca\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.511380 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.511363 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.511725 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.511708 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-tls\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.511829 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.511811 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f029b748-78df-447d-8158-9a6cca578bb4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.517830 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.517806 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn84h\" (UniqueName: \"kubernetes.io/projected/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-api-access-xn84h\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:55.519031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.519008 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2mzt\" (UniqueName: \"kubernetes.io/projected/f029b748-78df-447d-8158-9a6cca578bb4-kube-api-access-p2mzt\") pod \"node-exporter-2bmn2\" (UID: \"f029b748-78df-447d-8158-9a6cca578bb4\") " pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.592222 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:55.592174 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2bmn2" Apr 24 21:31:55.599479 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:31:55.599457 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf029b748_78df_447d_8158_9a6cca578bb4.slice/crio-1d6843ed361f114ca493e87d3b73e35e4bf607baca7795939d81adf9b35b6ca0 WatchSource:0}: Error finding container 1d6843ed361f114ca493e87d3b73e35e4bf607baca7795939d81adf9b35b6ca0: Status 404 returned error can't find the container with id 1d6843ed361f114ca493e87d3b73e35e4bf607baca7795939d81adf9b35b6ca0 Apr 24 21:31:56.013887 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:56.013854 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:56.016169 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:56.016142 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8ab5b171-1976-4834-9bdc-b77f8f597ceb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ws6r5\" (UID: \"8ab5b171-1976-4834-9bdc-b77f8f597ceb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:56.200664 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:56.200633 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" Apr 24 21:31:56.446024 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:56.445991 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-ws6r5"] Apr 24 21:31:56.448570 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:31:56.448546 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ab5b171_1976_4834_9bdc_b77f8f597ceb.slice/crio-b570f7d7589d358eed58302b2c234d0a0a85f4a1c2ea9240467046b4902d268a WatchSource:0}: Error finding container b570f7d7589d358eed58302b2c234d0a0a85f4a1c2ea9240467046b4902d268a: Status 404 returned error can't find the container with id b570f7d7589d358eed58302b2c234d0a0a85f4a1c2ea9240467046b4902d268a Apr 24 21:31:56.493774 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:56.493740 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" event={"ID":"8ab5b171-1976-4834-9bdc-b77f8f597ceb","Type":"ContainerStarted","Data":"b570f7d7589d358eed58302b2c234d0a0a85f4a1c2ea9240467046b4902d268a"} Apr 24 21:31:56.495042 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:56.495019 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bmn2" event={"ID":"f029b748-78df-447d-8158-9a6cca578bb4","Type":"ContainerStarted","Data":"d92041ce51254c0fcaa92ea55cdfc83179db8218b35327f3443d8379f916fb75"} Apr 24 21:31:56.495115 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:56.495051 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bmn2" event={"ID":"f029b748-78df-447d-8158-9a6cca578bb4","Type":"ContainerStarted","Data":"1d6843ed361f114ca493e87d3b73e35e4bf607baca7795939d81adf9b35b6ca0"} Apr 24 21:31:57.499473 
ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:57.499433 2567 generic.go:358] "Generic (PLEG): container finished" podID="f029b748-78df-447d-8158-9a6cca578bb4" containerID="d92041ce51254c0fcaa92ea55cdfc83179db8218b35327f3443d8379f916fb75" exitCode=0 Apr 24 21:31:57.499876 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:57.499494 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bmn2" event={"ID":"f029b748-78df-447d-8158-9a6cca578bb4","Type":"ContainerDied","Data":"d92041ce51254c0fcaa92ea55cdfc83179db8218b35327f3443d8379f916fb75"} Apr 24 21:31:58.506434 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:58.505200 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bmn2" event={"ID":"f029b748-78df-447d-8158-9a6cca578bb4","Type":"ContainerStarted","Data":"c48cd94b74573e30200b20238c8eb136b99b5fae4b70affd286466c6d1d3e774"} Apr 24 21:31:58.506434 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:58.505242 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2bmn2" event={"ID":"f029b748-78df-447d-8158-9a6cca578bb4","Type":"ContainerStarted","Data":"918f554af1ec6a19a2009a6f564030fe1c179719edfbb17506151f8f14d2f670"} Apr 24 21:31:58.508561 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:58.508534 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" event={"ID":"8ab5b171-1976-4834-9bdc-b77f8f597ceb","Type":"ContainerStarted","Data":"544d6cac82761c6a56335ac20da5afba252cc5242bca9bfd7fe319cf5c4d11a2"} Apr 24 21:31:58.508695 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:58.508569 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" event={"ID":"8ab5b171-1976-4834-9bdc-b77f8f597ceb","Type":"ContainerStarted","Data":"3f262f98953b870e852df78e00d2a89fbe3dbefe6e8e01ebe47af8bc1383fa4f"} Apr 24 21:31:58.508695 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:58.508578 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" event={"ID":"8ab5b171-1976-4834-9bdc-b77f8f597ceb","Type":"ContainerStarted","Data":"278f5e1b58f62f8f213886ec742fafd22f699dbc6b3ebf50105d7eb77fb2030d"} Apr 24 21:31:58.531002 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:58.530936 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2bmn2" podStartSLOduration=2.787953287 podStartE2EDuration="3.530923159s" podCreationTimestamp="2026-04-24 21:31:55 +0000 UTC" firstStartedPulling="2026-04-24 21:31:55.601243963 +0000 UTC m=+200.143195546" lastFinishedPulling="2026-04-24 21:31:56.344213839 +0000 UTC m=+200.886165418" observedRunningTime="2026-04-24 21:31:58.53014043 +0000 UTC m=+203.072092030" watchObservedRunningTime="2026-04-24 21:31:58.530923159 +0000 UTC m=+203.072874758" Apr 24 21:31:58.558348 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:58.558308 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-ws6r5" podStartSLOduration=2.417127536 podStartE2EDuration="3.558296323s" podCreationTimestamp="2026-04-24 21:31:55 +0000 UTC" firstStartedPulling="2026-04-24 21:31:56.450704826 +0000 UTC m=+200.992656408" lastFinishedPulling="2026-04-24 21:31:57.591873605 +0000 UTC m=+202.133825195" observedRunningTime="2026-04-24 21:31:58.557815566 +0000 UTC m=+203.099767200" 
watchObservedRunningTime="2026-04-24 21:31:58.558296323 +0000 UTC m=+203.100247921" Apr 24 21:31:59.747974 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.747919 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-68995464cf-q4lvq"] Apr 24 21:31:59.750179 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.750160 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.753815 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.753796 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 21:31:59.753924 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.753815 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-fhprhu8su3fld\"" Apr 24 21:31:59.754328 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.754302 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 21:31:59.754437 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.754345 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 21:31:59.754437 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.754355 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 21:31:59.754612 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.754595 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-vb99f\"" Apr 24 21:31:59.764974 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.764953 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-68995464cf-q4lvq"] Apr 24 21:31:59.844362 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.844334 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0102fb1c-fbc0-464f-8967-c430cf0ee1df-audit-log\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.844502 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.844380 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0102fb1c-fbc0-464f-8967-c430cf0ee1df-client-ca-bundle\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.844502 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.844399 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0102fb1c-fbc0-464f-8967-c430cf0ee1df-secret-metrics-server-client-certs\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.844624 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.844509 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5s2md\" (UniqueName: \"kubernetes.io/projected/0102fb1c-fbc0-464f-8967-c430cf0ee1df-kube-api-access-5s2md\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.844624 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.844566 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0102fb1c-fbc0-464f-8967-c430cf0ee1df-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.844624 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.844589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0102fb1c-fbc0-464f-8967-c430cf0ee1df-secret-metrics-server-tls\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.844719 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.844625 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0102fb1c-fbc0-464f-8967-c430cf0ee1df-metrics-server-audit-profiles\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.945847 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.945819 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0102fb1c-fbc0-464f-8967-c430cf0ee1df-secret-metrics-server-tls\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.946027 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.945870 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0102fb1c-fbc0-464f-8967-c430cf0ee1df-metrics-server-audit-profiles\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.946080 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.946046 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0102fb1c-fbc0-464f-8967-c430cf0ee1df-audit-log\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.946131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.946112 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0102fb1c-fbc0-464f-8967-c430cf0ee1df-client-ca-bundle\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.946171 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.946140 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0102fb1c-fbc0-464f-8967-c430cf0ee1df-secret-metrics-server-client-certs\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.946235 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.946219 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2md\" (UniqueName: \"kubernetes.io/projected/0102fb1c-fbc0-464f-8967-c430cf0ee1df-kube-api-access-5s2md\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.946293 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.946259 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0102fb1c-fbc0-464f-8967-c430cf0ee1df-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.946483 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.946457 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0102fb1c-fbc0-464f-8967-c430cf0ee1df-audit-log\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.946854 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.946829 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0102fb1c-fbc0-464f-8967-c430cf0ee1df-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.947292 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.947273 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0102fb1c-fbc0-464f-8967-c430cf0ee1df-metrics-server-audit-profiles\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.948347 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.948326 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0102fb1c-fbc0-464f-8967-c430cf0ee1df-secret-metrics-server-tls\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.948606 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.948584 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0102fb1c-fbc0-464f-8967-c430cf0ee1df-client-ca-bundle\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.948733 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.948718 2567 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0102fb1c-fbc0-464f-8967-c430cf0ee1df-secret-metrics-server-client-certs\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:31:59.954915 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:31:59.954892 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2md\" (UniqueName: \"kubernetes.io/projected/0102fb1c-fbc0-464f-8967-c430cf0ee1df-kube-api-access-5s2md\") pod \"metrics-server-68995464cf-q4lvq\" (UID: \"0102fb1c-fbc0-464f-8967-c430cf0ee1df\") " pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:32:00.061041 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:00.060353 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:32:00.178498 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:00.178472 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-68995464cf-q4lvq"] Apr 24 21:32:00.181699 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:32:00.181673 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0102fb1c_fbc0_464f_8967_c430cf0ee1df.slice/crio-4356cd99678a32c97edbae0d98ed5db419e1b5f578e48a63f9f61a7c669d1140 WatchSource:0}: Error finding container 4356cd99678a32c97edbae0d98ed5db419e1b5f578e48a63f9f61a7c669d1140: Status 404 returned error can't find the container with id 4356cd99678a32c97edbae0d98ed5db419e1b5f578e48a63f9f61a7c669d1140 Apr 24 21:32:00.514787 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:00.514754 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" event={"ID":"0102fb1c-fbc0-464f-8967-c430cf0ee1df","Type":"ContainerStarted","Data":"4356cd99678a32c97edbae0d98ed5db419e1b5f578e48a63f9f61a7c669d1140"} Apr 24 21:32:02.523723 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:02.523687 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" event={"ID":"0102fb1c-fbc0-464f-8967-c430cf0ee1df","Type":"ContainerStarted","Data":"674fa6c5877568a80055a89db44034c8d8d264cf023bef530fc6be68c3242a72"} Apr 24 21:32:02.548708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:02.548662 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" podStartSLOduration=2.205937825 podStartE2EDuration="3.548647141s" podCreationTimestamp="2026-04-24 21:31:59 +0000 UTC" firstStartedPulling="2026-04-24 21:32:00.183610071 +0000 UTC m=+204.725561661" lastFinishedPulling="2026-04-24 21:32:01.526319383 +0000 UTC m=+206.068270977" observedRunningTime="2026-04-24 21:32:02.546749925 +0000 UTC m=+207.088701523" watchObservedRunningTime="2026-04-24 21:32:02.548647141 +0000 UTC m=+207.090598739" Apr 24 21:32:08.476447 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:08.476410 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-qrhbm"] Apr 24 21:32:08.478404 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:08.478387 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-qrhbm" Apr 24 21:32:08.480604 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:08.480585 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:32:08.480719 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:08.480610 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-wxmk2\"" Apr 24 21:32:08.480719 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:08.480653 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:32:08.490757 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:08.490737 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-qrhbm"] Apr 24 21:32:08.617696 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:08.617670 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h2qf\" (UniqueName: \"kubernetes.io/projected/03c30a98-1967-4567-88ca-9f2cc397987a-kube-api-access-7h2qf\") pod \"downloads-6bcc868b7-qrhbm\" (UID: \"03c30a98-1967-4567-88ca-9f2cc397987a\") " pod="openshift-console/downloads-6bcc868b7-qrhbm" Apr 24 21:32:08.718981 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:08.718956 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7h2qf\" (UniqueName: \"kubernetes.io/projected/03c30a98-1967-4567-88ca-9f2cc397987a-kube-api-access-7h2qf\") pod \"downloads-6bcc868b7-qrhbm\" (UID: \"03c30a98-1967-4567-88ca-9f2cc397987a\") " pod="openshift-console/downloads-6bcc868b7-qrhbm" Apr 24 21:32:08.727448 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:08.727388 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h2qf\" (UniqueName: \"kubernetes.io/projected/03c30a98-1967-4567-88ca-9f2cc397987a-kube-api-access-7h2qf\") pod \"downloads-6bcc868b7-qrhbm\" (UID: \"03c30a98-1967-4567-88ca-9f2cc397987a\") " pod="openshift-console/downloads-6bcc868b7-qrhbm" Apr 24 21:32:08.787008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:08.786981 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-qrhbm" Apr 24 21:32:08.901853 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:08.901826 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-qrhbm"] Apr 24 21:32:08.904835 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:32:08.904805 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03c30a98_1967_4567_88ca_9f2cc397987a.slice/crio-05caf55d9dd766ad14b7a510eabb87c1bf47e4b087079c09c00fe6c88ae155b1 WatchSource:0}: Error finding container 05caf55d9dd766ad14b7a510eabb87c1bf47e4b087079c09c00fe6c88ae155b1: Status 404 returned error can't find the container with id 05caf55d9dd766ad14b7a510eabb87c1bf47e4b087079c09c00fe6c88ae155b1 Apr 24 21:32:09.153000 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.152970 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8fbf6b564-dm5w4"] Apr 24 21:32:09.153180 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:32:09.153162 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" podUID="e10024bb-2bc4-4af2-987b-a3c774bd83a1" Apr 24 21:32:09.546009 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.545921 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:32:09.546417 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.546347 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-qrhbm" event={"ID":"03c30a98-1967-4567-88ca-9f2cc397987a","Type":"ContainerStarted","Data":"05caf55d9dd766ad14b7a510eabb87c1bf47e4b087079c09c00fe6c88ae155b1"} Apr 24 21:32:09.553106 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.553078 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:32:09.728585 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.728546 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e10024bb-2bc4-4af2-987b-a3c774bd83a1-installation-pull-secrets\") pod \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " Apr 24 21:32:09.728748 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.728596 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-bound-sa-token\") pod \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " Apr 24 21:32:09.728748 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.728687 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e10024bb-2bc4-4af2-987b-a3c774bd83a1-trusted-ca\") pod \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " Apr 24 21:32:09.728748 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.728731 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e10024bb-2bc4-4af2-987b-a3c774bd83a1-image-registry-private-configuration\") pod \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " Apr 24 21:32:09.728901 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.728771 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgbz9\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-kube-api-access-lgbz9\") pod \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " Apr 24 21:32:09.728901 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.728806 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-certificates\") pod \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " Apr 24 21:32:09.728901 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.728846 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e10024bb-2bc4-4af2-987b-a3c774bd83a1-ca-trust-extracted\") pod \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\" (UID: \"e10024bb-2bc4-4af2-987b-a3c774bd83a1\") " Apr 24 21:32:09.729387 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.729359 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e10024bb-2bc4-4af2-987b-a3c774bd83a1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e10024bb-2bc4-4af2-987b-a3c774bd83a1" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:32:09.729628 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.729577 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e10024bb-2bc4-4af2-987b-a3c774bd83a1" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:32:09.729628 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.729588 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e10024bb-2bc4-4af2-987b-a3c774bd83a1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e10024bb-2bc4-4af2-987b-a3c774bd83a1" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:32:09.731280 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.731250 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e10024bb-2bc4-4af2-987b-a3c774bd83a1" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:32:09.731390 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.731276 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e10024bb-2bc4-4af2-987b-a3c774bd83a1-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e10024bb-2bc4-4af2-987b-a3c774bd83a1" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:32:09.731390 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.731333 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-kube-api-access-lgbz9" (OuterVolumeSpecName: "kube-api-access-lgbz9") pod "e10024bb-2bc4-4af2-987b-a3c774bd83a1" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1"). InnerVolumeSpecName "kube-api-access-lgbz9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:32:09.731473 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.731412 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e10024bb-2bc4-4af2-987b-a3c774bd83a1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e10024bb-2bc4-4af2-987b-a3c774bd83a1" (UID: "e10024bb-2bc4-4af2-987b-a3c774bd83a1"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:32:09.829944 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.829870 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e10024bb-2bc4-4af2-987b-a3c774bd83a1-trusted-ca\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:32:09.829944 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.829898 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e10024bb-2bc4-4af2-987b-a3c774bd83a1-image-registry-private-configuration\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:32:09.829944 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.829913 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lgbz9\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-kube-api-access-lgbz9\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:32:09.829944 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.829928 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-certificates\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:32:09.829944 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.829942 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e10024bb-2bc4-4af2-987b-a3c774bd83a1-ca-trust-extracted\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:32:09.830308 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.829957 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e10024bb-2bc4-4af2-987b-a3c774bd83a1-installation-pull-secrets\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:32:09.830308 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:09.829971 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-bound-sa-token\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:32:10.549098 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:10.549067 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8fbf6b564-dm5w4" Apr 24 21:32:10.580973 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:10.580943 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8fbf6b564-dm5w4"] Apr 24 21:32:10.585564 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:10.585539 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8fbf6b564-dm5w4"] Apr 24 21:32:10.737436 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:10.737398 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e10024bb-2bc4-4af2-987b-a3c774bd83a1-registry-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:32:11.982838 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:11.982799 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e10024bb-2bc4-4af2-987b-a3c774bd83a1" path="/var/lib/kubelet/pods/e10024bb-2bc4-4af2-987b-a3c774bd83a1/volumes" Apr 24 21:32:17.144710 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.144678 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7645c6b5bf-bjxb9"] Apr 24 21:32:17.151569 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.151547 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.154708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.154683 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 21:32:17.154708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.154703 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 21:32:17.154885 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.154709 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 21:32:17.154885 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.154721 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 21:32:17.154885 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.154746 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 21:32:17.154885 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.154687 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-z9wmp\"" Apr 24 21:32:17.169491 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.169470 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7645c6b5bf-bjxb9"] Apr 24 21:32:17.196986 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.196965 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788a9f-dce9-4c17-bc5a-212515744e3a-console-serving-cert\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.197096 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.196994 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/86788a9f-dce9-4c17-bc5a-212515744e3a-console-oauth-config\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.197096 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.197022 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-service-ca\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.197096 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.197052 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-console-config\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.197245 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.197158 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr67s\" (UniqueName: \"kubernetes.io/projected/86788a9f-dce9-4c17-bc5a-212515744e3a-kube-api-access-vr67s\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.197245 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.197200 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-oauth-serving-cert\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.298452 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.298418 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788a9f-dce9-4c17-bc5a-212515744e3a-console-serving-cert\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.298615 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.298466 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788a9f-dce9-4c17-bc5a-212515744e3a-console-oauth-config\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.298615 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.298504 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-service-ca\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.298615 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.298564 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-console-config\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " 
pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.298783 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.298632 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr67s\" (UniqueName: \"kubernetes.io/projected/86788a9f-dce9-4c17-bc5a-212515744e3a-kube-api-access-vr67s\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.298783 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.298666 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-oauth-serving-cert\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.299432 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.299341 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-service-ca\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.299432 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.299366 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-console-config\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.299632 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.299548 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-oauth-serving-cert\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.301708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.301685 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788a9f-dce9-4c17-bc5a-212515744e3a-console-serving-cert\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.301708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.301699 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788a9f-dce9-4c17-bc5a-212515744e3a-console-oauth-config\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.311064 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.311038 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr67s\" (UniqueName: \"kubernetes.io/projected/86788a9f-dce9-4c17-bc5a-212515744e3a-kube-api-access-vr67s\") pod \"console-7645c6b5bf-bjxb9\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.462606 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.462537 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:17.613059 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:17.613029 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7645c6b5bf-bjxb9"] Apr 24 21:32:17.616811 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:32:17.616782 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86788a9f_dce9_4c17_bc5a_212515744e3a.slice/crio-68b4299431eeacb12916e56c739b3d8634c651284f436b6b2e57d9e3f94bf526 WatchSource:0}: Error finding container 68b4299431eeacb12916e56c739b3d8634c651284f436b6b2e57d9e3f94bf526: Status 404 returned error can't find the container with id 68b4299431eeacb12916e56c739b3d8634c651284f436b6b2e57d9e3f94bf526 Apr 24 21:32:18.572965 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:18.572926 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7645c6b5bf-bjxb9" event={"ID":"86788a9f-dce9-4c17-bc5a-212515744e3a","Type":"ContainerStarted","Data":"68b4299431eeacb12916e56c739b3d8634c651284f436b6b2e57d9e3f94bf526"} Apr 24 21:32:20.061270 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:20.061236 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:32:20.061708 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:20.061278 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:32:24.592508 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.592475 2567 generic.go:358] "Generic (PLEG): container finished" podID="23da5cf6-4806-441a-8d04-9ddc0c84d07b" containerID="090124b397771b8a5c2e6d0207d9a9f4eceb3eb11e4c4bc8508ced3bd6d9fb5a" exitCode=0 Apr 24 21:32:24.592823 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.592552 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" event={"ID":"23da5cf6-4806-441a-8d04-9ddc0c84d07b","Type":"ContainerDied","Data":"090124b397771b8a5c2e6d0207d9a9f4eceb3eb11e4c4bc8508ced3bd6d9fb5a"} Apr 24 21:32:24.592823 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.592813 2567 scope.go:117] "RemoveContainer" containerID="090124b397771b8a5c2e6d0207d9a9f4eceb3eb11e4c4bc8508ced3bd6d9fb5a" Apr 24 21:32:24.734939 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.734873 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64bb4544d4-lpbvc"] Apr 24 21:32:24.758883 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.758856 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64bb4544d4-lpbvc"] Apr 24 21:32:24.759021 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.758955 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.766227 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.766204 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 21:32:24.869982 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.869951 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-service-ca\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.869982 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.869982 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-oauth-serving-cert\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.870151 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.870026 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-config\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.870151 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.870058 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-serving-cert\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.870151 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.870074 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-trusted-ca-bundle\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.870151 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.870101 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-oauth-config\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.870151 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.870128 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvd8h\" (UniqueName: \"kubernetes.io/projected/fa409b5c-090b-4796-99cc-14e50ee7f4d5-kube-api-access-xvd8h\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.970656 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.970631 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvd8h\" (UniqueName: 
\"kubernetes.io/projected/fa409b5c-090b-4796-99cc-14e50ee7f4d5-kube-api-access-xvd8h\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.970772 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.970709 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-service-ca\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.970772 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.970739 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-oauth-serving-cert\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.970877 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.970807 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-config\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.970877 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.970862 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-serving-cert\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.970983 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.970886 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-trusted-ca-bundle\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.970983 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.970938 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-oauth-config\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.971647 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.971622 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-config\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.971863 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.971843 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-service-ca\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.972209 ip-10-0-134-249 kubenswrapper[2567]: I0424 
21:32:24.972160 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-oauth-serving-cert\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.973607 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.973557 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-trusted-ca-bundle\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.973806 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.973767 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-oauth-config\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.974479 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.974458 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-serving-cert\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:24.979667 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:24.979646 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvd8h\" (UniqueName: \"kubernetes.io/projected/fa409b5c-090b-4796-99cc-14e50ee7f4d5-kube-api-access-xvd8h\") pod \"console-64bb4544d4-lpbvc\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:25.069591 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:25.069564 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:25.214989 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:25.214742 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64bb4544d4-lpbvc"] Apr 24 21:32:25.603396 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:25.603348 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-qrhbm" event={"ID":"03c30a98-1967-4567-88ca-9f2cc397987a","Type":"ContainerStarted","Data":"4a2db97d0d6f583a91070692218e9fa4cdab3c6ca252a3ecf622923be6dcee3f"} Apr 24 21:32:25.604939 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:25.604850 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-qrhbm" Apr 24 21:32:25.609233 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:25.608837 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f22xr" event={"ID":"23da5cf6-4806-441a-8d04-9ddc0c84d07b","Type":"ContainerStarted","Data":"c62758d5d982e5ab84f810c4b7af57e4ade614d76bf21faa4c5547b8c41a554b"} Apr 24 21:32:25.612146 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:25.611951 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64bb4544d4-lpbvc" event={"ID":"fa409b5c-090b-4796-99cc-14e50ee7f4d5","Type":"ContainerStarted","Data":"c716ca303a1b240938b9ef9efd6a2eb37ad5c61b2d2ec4e11ecc50f4c78dccf3"} Apr 24 21:32:25.620146 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:25.620119 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-qrhbm" Apr 24 21:32:25.654646 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:25.654574 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-qrhbm" podStartSLOduration=1.635385605 podStartE2EDuration="17.654556208s" podCreationTimestamp="2026-04-24 21:32:08 +0000 UTC" firstStartedPulling="2026-04-24 21:32:08.906671179 +0000 UTC m=+213.448622757" lastFinishedPulling="2026-04-24 21:32:24.925841765 +0000 UTC m=+229.467793360" observedRunningTime="2026-04-24 21:32:25.624579395 +0000 UTC m=+230.166530995" watchObservedRunningTime="2026-04-24 21:32:25.654556208 +0000 UTC m=+230.196507809" Apr 24 21:32:29.628276 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:29.628234 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64bb4544d4-lpbvc" event={"ID":"fa409b5c-090b-4796-99cc-14e50ee7f4d5","Type":"ContainerStarted","Data":"dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e"} Apr 24 21:32:29.630115 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:29.630085 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7645c6b5bf-bjxb9" event={"ID":"86788a9f-dce9-4c17-bc5a-212515744e3a","Type":"ContainerStarted","Data":"c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561"} Apr 24 21:32:29.658434 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:29.658381 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64bb4544d4-lpbvc" podStartSLOduration=2.377826694 podStartE2EDuration="5.65836854s" podCreationTimestamp="2026-04-24 21:32:24 +0000 UTC" firstStartedPulling="2026-04-24 21:32:25.221482074 +0000 UTC m=+229.763433657" lastFinishedPulling="2026-04-24 21:32:28.502023921 +0000 UTC 
m=+233.043975503" observedRunningTime="2026-04-24 21:32:29.656646869 +0000 UTC m=+234.198598468" watchObservedRunningTime="2026-04-24 21:32:29.65836854 +0000 UTC m=+234.200320139" Apr 24 21:32:29.679541 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:29.679489 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7645c6b5bf-bjxb9" podStartSLOduration=1.803345498 podStartE2EDuration="12.679474791s" podCreationTimestamp="2026-04-24 21:32:17 +0000 UTC" firstStartedPulling="2026-04-24 21:32:17.618849134 +0000 UTC m=+222.160800715" lastFinishedPulling="2026-04-24 21:32:28.494978429 +0000 UTC m=+233.036930008" observedRunningTime="2026-04-24 21:32:29.679052819 +0000 UTC m=+234.221004451" watchObservedRunningTime="2026-04-24 21:32:29.679474791 +0000 UTC m=+234.221426392" Apr 24 21:32:35.070343 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:35.070307 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:35.070816 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:35.070393 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:35.075029 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:35.075001 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:35.652219 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:35.652188 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:32:35.699195 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:35.699169 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7645c6b5bf-bjxb9"] Apr 24 21:32:37.462822 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:37.462782 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:32:40.066980 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:40.066951 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:32:40.071086 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:40.071062 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-68995464cf-q4lvq" Apr 24 21:32:47.680925 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:47.680889 2567 generic.go:358] "Generic (PLEG): container finished" podID="32477472-4713-476c-ac5d-4d2a735ad4b7" containerID="48b2458b3158ec82c8b2f168a11e30c39bfb6d8aa7f2e81d5434244b49192f7a" exitCode=0 Apr 24 21:32:47.681337 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:47.680963 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-crpwz" event={"ID":"32477472-4713-476c-ac5d-4d2a735ad4b7","Type":"ContainerDied","Data":"48b2458b3158ec82c8b2f168a11e30c39bfb6d8aa7f2e81d5434244b49192f7a"} Apr 24 21:32:47.681337 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:47.681320 2567 scope.go:117] "RemoveContainer" containerID="48b2458b3158ec82c8b2f168a11e30c39bfb6d8aa7f2e81d5434244b49192f7a" Apr 24 21:32:47.866308 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:47.866277 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:32:47.868471 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:47.868449 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58-metrics-certs\") pod \"network-metrics-daemon-fnmhx\" (UID: \"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58\") " pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:32:48.080968 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:48.080898 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-flqk8\"" Apr 24 21:32:48.088882 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:48.088861 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fnmhx" Apr 24 21:32:48.207268 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:48.207239 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fnmhx"] Apr 24 21:32:48.685688 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:48.685652 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-crpwz" event={"ID":"32477472-4713-476c-ac5d-4d2a735ad4b7","Type":"ContainerStarted","Data":"3798bf8c79b97087b0e46e454191d770c5a256c4617e09b770d0335604417eb1"} Apr 24 21:32:48.686667 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:48.686643 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fnmhx" event={"ID":"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58","Type":"ContainerStarted","Data":"fce4153819c04d742e01d3357885fdd5b236647d950e781f4eff4714acd5e476"} Apr 24 21:32:50.694179 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:50.694138 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fnmhx" event={"ID":"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58","Type":"ContainerStarted","Data":"d9fab6841615662e60cb199541b55f021ddacbc0158587cb9edbcb727dbb9e57"} Apr 24 21:32:50.694656 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:50.694187 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fnmhx" event={"ID":"b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58","Type":"ContainerStarted","Data":"afab6fa473d618b1f46d511edb6ffaf2b1043811c2362184f7295f4da7587aa3"} Apr 24 21:32:50.710848 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:32:50.710792 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fnmhx" podStartSLOduration=253.060075907 podStartE2EDuration="4m14.710774055s" podCreationTimestamp="2026-04-24 21:28:36 +0000 UTC" firstStartedPulling="2026-04-24 21:32:48.214261943 +0000 UTC m=+252.756213521" lastFinishedPulling="2026-04-24 21:32:49.864960093 +0000 UTC m=+254.406911669" observedRunningTime="2026-04-24 21:32:50.70882882 +0000 UTC m=+255.250780418" watchObservedRunningTime="2026-04-24 21:32:50.710774055 +0000 UTC m=+255.252725654" Apr 24 21:33:00.718575 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:00.718474 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7645c6b5bf-bjxb9" podUID="86788a9f-dce9-4c17-bc5a-212515744e3a" containerName="console" 
containerID="cri-o://c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561" gracePeriod=15 Apr 24 21:33:01.040550 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.037922 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7645c6b5bf-bjxb9_86788a9f-dce9-4c17-bc5a-212515744e3a/console/0.log" Apr 24 21:33:01.040550 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.038005 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:33:01.069789 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.069757 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788a9f-dce9-4c17-bc5a-212515744e3a-console-serving-cert\") pod \"86788a9f-dce9-4c17-bc5a-212515744e3a\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " Apr 24 21:33:01.069933 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.069827 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788a9f-dce9-4c17-bc5a-212515744e3a-console-oauth-config\") pod \"86788a9f-dce9-4c17-bc5a-212515744e3a\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " Apr 24 21:33:01.069933 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.069884 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-service-ca\") pod \"86788a9f-dce9-4c17-bc5a-212515744e3a\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " Apr 24 21:33:01.070020 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.069934 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-oauth-serving-cert\") pod \"86788a9f-dce9-4c17-bc5a-212515744e3a\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " Apr 24 21:33:01.070020 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.069966 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-console-config\") pod \"86788a9f-dce9-4c17-bc5a-212515744e3a\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " Apr 24 21:33:01.070020 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.069988 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr67s\" (UniqueName: \"kubernetes.io/projected/86788a9f-dce9-4c17-bc5a-212515744e3a-kube-api-access-vr67s\") pod \"86788a9f-dce9-4c17-bc5a-212515744e3a\" (UID: \"86788a9f-dce9-4c17-bc5a-212515744e3a\") " Apr 24 21:33:01.070512 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.070443 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-service-ca" (OuterVolumeSpecName: "service-ca") pod "86788a9f-dce9-4c17-bc5a-212515744e3a" (UID: "86788a9f-dce9-4c17-bc5a-212515744e3a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:33:01.070794 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.070744 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-service-ca\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:33:01.071269 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.070993 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "86788a9f-dce9-4c17-bc5a-212515744e3a" (UID: "86788a9f-dce9-4c17-bc5a-212515744e3a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:33:01.072626 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.071498 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-console-config" (OuterVolumeSpecName: "console-config") pod "86788a9f-dce9-4c17-bc5a-212515744e3a" (UID: "86788a9f-dce9-4c17-bc5a-212515744e3a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:33:01.077978 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.077937 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86788a9f-dce9-4c17-bc5a-212515744e3a-kube-api-access-vr67s" (OuterVolumeSpecName: "kube-api-access-vr67s") pod "86788a9f-dce9-4c17-bc5a-212515744e3a" (UID: "86788a9f-dce9-4c17-bc5a-212515744e3a"). InnerVolumeSpecName "kube-api-access-vr67s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:33:01.078619 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.078567 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86788a9f-dce9-4c17-bc5a-212515744e3a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "86788a9f-dce9-4c17-bc5a-212515744e3a" (UID: "86788a9f-dce9-4c17-bc5a-212515744e3a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:01.078734 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.078619 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86788a9f-dce9-4c17-bc5a-212515744e3a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "86788a9f-dce9-4c17-bc5a-212515744e3a" (UID: "86788a9f-dce9-4c17-bc5a-212515744e3a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:33:01.171554 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.171511 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/86788a9f-dce9-4c17-bc5a-212515744e3a-console-serving-cert\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:33:01.171554 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.171551 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/86788a9f-dce9-4c17-bc5a-212515744e3a-console-oauth-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:33:01.171554 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.171562 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-oauth-serving-cert\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:33:01.171845 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.171571 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/86788a9f-dce9-4c17-bc5a-212515744e3a-console-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:33:01.171845 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.171581 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vr67s\" (UniqueName: \"kubernetes.io/projected/86788a9f-dce9-4c17-bc5a-212515744e3a-kube-api-access-vr67s\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:33:01.726833 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.726796 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7645c6b5bf-bjxb9_86788a9f-dce9-4c17-bc5a-212515744e3a/console/0.log" Apr 24 21:33:01.727281 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.726850 2567 generic.go:358] "Generic (PLEG): container finished" podID="86788a9f-dce9-4c17-bc5a-212515744e3a" containerID="c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561" exitCode=2 Apr 24 21:33:01.727281 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.726909 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7645c6b5bf-bjxb9" event={"ID":"86788a9f-dce9-4c17-bc5a-212515744e3a","Type":"ContainerDied","Data":"c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561"} Apr 24 21:33:01.727281 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.726942 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7645c6b5bf-bjxb9" event={"ID":"86788a9f-dce9-4c17-bc5a-212515744e3a","Type":"ContainerDied","Data":"68b4299431eeacb12916e56c739b3d8634c651284f436b6b2e57d9e3f94bf526"} Apr 24 21:33:01.727281 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.726964 2567 scope.go:117] "RemoveContainer" containerID="c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561" Apr 24 21:33:01.727281 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.727167 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7645c6b5bf-bjxb9" Apr 24 21:33:01.741743 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.741721 2567 scope.go:117] "RemoveContainer" containerID="c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561" Apr 24 21:33:01.742015 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:33:01.741992 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561\": container with ID starting with c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561 not found: ID does not exist" containerID="c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561" Apr 24 21:33:01.742101 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.742026 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561"} err="failed to get container status \"c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561\": rpc error: code = NotFound desc = could not find container \"c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561\": container with ID starting with c146bf1ceace1c4f4cc96014e29ee5d667b35ff438ce046304ad6c114a4b0561 not found: ID does not exist" Apr 24 21:33:01.753805 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.753778 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7645c6b5bf-bjxb9"] Apr 24 21:33:01.758580 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.758555 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7645c6b5bf-bjxb9"] Apr 24 21:33:01.981952 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:01.981880 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86788a9f-dce9-4c17-bc5a-212515744e3a" path="/var/lib/kubelet/pods/86788a9f-dce9-4c17-bc5a-212515744e3a/volumes" Apr 24 21:33:14.352945 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:33:14.352903 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rrff5" podUID="3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f" Apr 24 21:33:14.352945 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:33:14.352904 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" podUID="6b55e3f0-b53c-4afe-88de-6b0ada988fc9" Apr 24 21:33:14.352945 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:33:14.352904 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-5tp2b" podUID="6a5a230a-7dee-4c4a-882c-d0cb1e017e43" Apr 24 21:33:14.765078 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:14.764996 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:33:14.765687 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:14.765312 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5tp2b" Apr 24 21:33:14.765941 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:14.765344 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:33:18.207726 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.207686 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:33:18.208220 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.207732 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:33:18.208220 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.207783 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:33:18.210263 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.210237 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6b55e3f0-b53c-4afe-88de-6b0ada988fc9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f8hxj\" (UID: \"6b55e3f0-b53c-4afe-88de-6b0ada988fc9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:33:18.210376 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.210330 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a5a230a-7dee-4c4a-882c-d0cb1e017e43-metrics-tls\") pod \"dns-default-5tp2b\" (UID: \"6a5a230a-7dee-4c4a-882c-d0cb1e017e43\") " pod="openshift-dns/dns-default-5tp2b" Apr 24 21:33:18.210376 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.210356 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f-cert\") pod \"ingress-canary-rrff5\" (UID: \"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f\") " pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:33:18.368032 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.368005 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-54st7\"" Apr 24 21:33:18.368726 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.368711 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kxf55\"" Apr 24 21:33:18.369002 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.368988 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-sbjvw\"" Apr 24 21:33:18.376495 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.376480 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" Apr 24 21:33:18.376784 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.376772 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rrff5" Apr 24 21:33:18.376847 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.376831 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5tp2b" Apr 24 21:33:18.546780 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.546686 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5tp2b"] Apr 24 21:33:18.549617 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:33:18.549575 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a5a230a_7dee_4c4a_882c_d0cb1e017e43.slice/crio-e337e152d9ccdea8abf225f61b1bcde83c1b9f7e77c9eee0c4deaa88a61f623f WatchSource:0}: Error finding container e337e152d9ccdea8abf225f61b1bcde83c1b9f7e77c9eee0c4deaa88a61f623f: Status 404 returned error can't find the container with id e337e152d9ccdea8abf225f61b1bcde83c1b9f7e77c9eee0c4deaa88a61f623f Apr 24 21:33:18.764333 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.764312 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj"] Apr 24 21:33:18.765553 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.765516 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rrff5"] Apr 24 21:33:18.766725 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:33:18.766703 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b2d31d7_10c7_40a8_ba71_bb0eb1d00f2f.slice/crio-a4adce42b28eb89090fd58dabcfe0ca523c2d5b8c5dbe42f7f67dba836bd9947 WatchSource:0}: Error finding container a4adce42b28eb89090fd58dabcfe0ca523c2d5b8c5dbe42f7f67dba836bd9947: Status 404 returned error can't find the container with id a4adce42b28eb89090fd58dabcfe0ca523c2d5b8c5dbe42f7f67dba836bd9947 Apr 24 21:33:18.767351 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:33:18.767332 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b55e3f0_b53c_4afe_88de_6b0ada988fc9.slice/crio-8c3388bde0562994257ad4294f3d087ed95d41494415f02c41f5fb0c354efc99 WatchSource:0}: Error finding container 8c3388bde0562994257ad4294f3d087ed95d41494415f02c41f5fb0c354efc99: Status 404 returned error can't find the container with id 8c3388bde0562994257ad4294f3d087ed95d41494415f02c41f5fb0c354efc99 Apr 24 21:33:18.777429 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.777406 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5tp2b" event={"ID":"6a5a230a-7dee-4c4a-882c-d0cb1e017e43","Type":"ContainerStarted","Data":"e337e152d9ccdea8abf225f61b1bcde83c1b9f7e77c9eee0c4deaa88a61f623f"} Apr 24 21:33:18.778327 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.778307 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" event={"ID":"6b55e3f0-b53c-4afe-88de-6b0ada988fc9","Type":"ContainerStarted","Data":"8c3388bde0562994257ad4294f3d087ed95d41494415f02c41f5fb0c354efc99"} Apr 24 21:33:18.779229 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:18.779191 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rrff5" event={"ID":"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f","Type":"ContainerStarted","Data":"a4adce42b28eb89090fd58dabcfe0ca523c2d5b8c5dbe42f7f67dba836bd9947"} Apr 24 21:33:21.789666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:21.789627 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" event={"ID":"6b55e3f0-b53c-4afe-88de-6b0ada988fc9","Type":"ContainerStarted","Data":"40c317ccdfa4cd53689e7fbff69a0243dd19544c9e3b91df25a9bcb7da2025b3"} Apr 24 21:33:21.790992 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:21.790964 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rrff5" event={"ID":"3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f","Type":"ContainerStarted","Data":"742c680e7eb82f1b1c2cadafd5dedf1e903230c8b8b34fb0d6fd71f54b965fd7"} Apr 24 21:33:21.792393 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:21.792370 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5tp2b" event={"ID":"6a5a230a-7dee-4c4a-882c-d0cb1e017e43","Type":"ContainerStarted","Data":"408639e4fc9fc7dcf562e21deb926eacb0bda3d313d71e1e9793e60a1fa9ebf4"} Apr 24 21:33:21.792483 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:21.792399 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5tp2b" event={"ID":"6a5a230a-7dee-4c4a-882c-d0cb1e017e43","Type":"ContainerStarted","Data":"78367b9edefcffffad76ac5f510371209dfdcc12213ca3f31257e053482badf0"} Apr 24 21:33:21.792579 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:21.792564 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5tp2b" Apr 24 21:33:21.808401 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:21.808354 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f8hxj" podStartSLOduration=281.724366257 podStartE2EDuration="4m43.808343315s" podCreationTimestamp="2026-04-24 21:28:38 +0000 UTC" firstStartedPulling="2026-04-24 21:33:18.769200096 +0000 UTC m=+283.311151673" lastFinishedPulling="2026-04-24 21:33:20.853177151 +0000 UTC m=+285.395128731" observedRunningTime="2026-04-24 21:33:21.807624788 +0000 UTC m=+286.349576390" watchObservedRunningTime="2026-04-24 21:33:21.808343315 +0000 UTC m=+286.350294913" Apr 24 21:33:21.836225 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:21.836183 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5tp2b" podStartSLOduration=251.533926488 podStartE2EDuration="4m13.836170254s" podCreationTimestamp="2026-04-24 21:29:08 +0000 UTC" firstStartedPulling="2026-04-24 21:33:18.551384899 +0000 UTC m=+283.093336479" lastFinishedPulling="2026-04-24 21:33:20.853628661 +0000 UTC m=+285.395580245" observedRunningTime="2026-04-24 21:33:21.83393489 +0000 UTC m=+286.375886489" watchObservedRunningTime="2026-04-24 21:33:21.836170254 +0000 UTC m=+286.378121853" Apr 24 21:33:21.864143 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:21.864104 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rrff5" podStartSLOduration=251.775185762 podStartE2EDuration="4m13.864093122s" podCreationTimestamp="2026-04-24 21:29:08 +0000 UTC" firstStartedPulling="2026-04-24 21:33:18.768610606 +0000 UTC m=+283.310562183" lastFinishedPulling="2026-04-24 
21:33:20.857517966 +0000 UTC m=+285.399469543" observedRunningTime="2026-04-24 21:33:21.863113683 +0000 UTC m=+286.405065292" watchObservedRunningTime="2026-04-24 21:33:21.864093122 +0000 UTC m=+286.406044721" Apr 24 21:33:24.218378 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.218346 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8684b9c47b-kcx4f"] Apr 24 21:33:24.218808 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.218742 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86788a9f-dce9-4c17-bc5a-212515744e3a" containerName="console" Apr 24 21:33:24.218808 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.218758 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="86788a9f-dce9-4c17-bc5a-212515744e3a" containerName="console" Apr 24 21:33:24.218808 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.218802 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="86788a9f-dce9-4c17-bc5a-212515744e3a" containerName="console" Apr 24 21:33:24.221957 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.221938 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.234583 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.232133 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8684b9c47b-kcx4f"] Apr 24 21:33:24.259321 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.259301 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34ea9c87-1756-413e-a9c1-738759f0a080-console-oauth-config\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.259439 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.259333 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-console-config\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.259439 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.259355 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-service-ca\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.259439 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.259378 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-trusted-ca-bundle\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.259439 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.259402 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzj7w\" (UniqueName: \"kubernetes.io/projected/34ea9c87-1756-413e-a9c1-738759f0a080-kube-api-access-lzj7w\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " 
pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.259439 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.259438 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-oauth-serving-cert\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.259652 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.259467 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34ea9c87-1756-413e-a9c1-738759f0a080-console-serving-cert\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.360443 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.360420 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-oauth-serving-cert\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.360569 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.360455 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34ea9c87-1756-413e-a9c1-738759f0a080-console-serving-cert\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.360569 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.360481 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34ea9c87-1756-413e-a9c1-738759f0a080-console-oauth-config\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.360687 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.360606 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-console-config\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.360687 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.360659 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-service-ca\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.360778 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.360692 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-trusted-ca-bundle\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.360778 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.360719 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lzj7w\" (UniqueName: \"kubernetes.io/projected/34ea9c87-1756-413e-a9c1-738759f0a080-kube-api-access-lzj7w\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.361198 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.361174 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-oauth-serving-cert\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.361334 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.361314 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-service-ca\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.361393 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.361325 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-console-config\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.361552 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.361514 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-trusted-ca-bundle\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.363562 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.363540 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34ea9c87-1756-413e-a9c1-738759f0a080-console-oauth-config\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.363659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.363545 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34ea9c87-1756-413e-a9c1-738759f0a080-console-serving-cert\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.372236 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.372214 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzj7w\" (UniqueName: \"kubernetes.io/projected/34ea9c87-1756-413e-a9c1-738759f0a080-kube-api-access-lzj7w\") pod \"console-8684b9c47b-kcx4f\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.536933 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.536872 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:24.662853 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.662821 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8684b9c47b-kcx4f"] Apr 24 21:33:24.666132 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:33:24.666110 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34ea9c87_1756_413e_a9c1_738759f0a080.slice/crio-bdbdfc9fcfd3bd819afbcb05efa68359aac80ab26a77c7eda6e5476c1cc27f03 WatchSource:0}: Error finding container bdbdfc9fcfd3bd819afbcb05efa68359aac80ab26a77c7eda6e5476c1cc27f03: Status 404 returned error can't find the container with id bdbdfc9fcfd3bd819afbcb05efa68359aac80ab26a77c7eda6e5476c1cc27f03 Apr 24 21:33:24.803196 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.803128 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8684b9c47b-kcx4f" event={"ID":"34ea9c87-1756-413e-a9c1-738759f0a080","Type":"ContainerStarted","Data":"00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c"} Apr 24 21:33:24.803196 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.803164 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8684b9c47b-kcx4f" event={"ID":"34ea9c87-1756-413e-a9c1-738759f0a080","Type":"ContainerStarted","Data":"bdbdfc9fcfd3bd819afbcb05efa68359aac80ab26a77c7eda6e5476c1cc27f03"} Apr 24 21:33:24.822069 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:24.822015 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8684b9c47b-kcx4f" podStartSLOduration=0.821996007 podStartE2EDuration="821.996007ms" podCreationTimestamp="2026-04-24 21:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:33:24.820801978 +0000 UTC m=+289.362753577" watchObservedRunningTime="2026-04-24 21:33:24.821996007 +0000 UTC m=+289.363947607" Apr 24 21:33:31.798381 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:31.798350 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5tp2b" Apr 24 21:33:34.537817 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:34.537781 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:34.538156 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:34.537829 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:34.542558 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:34.542514 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:34.839567 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:34.839472 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:33:34.889630 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:34.889598 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64bb4544d4-lpbvc"] Apr 24 21:33:35.898166 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:35.898137 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 
21:33:35.898604 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:35.898587 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:33:35.902671 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:35.902651 2567 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:33:59.911028 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:33:59.910963 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-64bb4544d4-lpbvc" podUID="fa409b5c-090b-4796-99cc-14e50ee7f4d5" containerName="console" containerID="cri-o://dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e" gracePeriod=15 Apr 24 21:34:00.142704 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.142684 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64bb4544d4-lpbvc_fa409b5c-090b-4796-99cc-14e50ee7f4d5/console/0.log" Apr 24 21:34:00.142807 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.142741 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:34:00.318628 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.318554 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-service-ca\") pod \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " Apr 24 21:34:00.318628 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.318589 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-config\") pod \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " Apr 24 21:34:00.318628 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.318612 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvd8h\" (UniqueName: \"kubernetes.io/projected/fa409b5c-090b-4796-99cc-14e50ee7f4d5-kube-api-access-xvd8h\") pod \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " Apr 24 21:34:00.318628 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.318630 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-oauth-serving-cert\") pod \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " Apr 24 21:34:00.318906 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.318684 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-trusted-ca-bundle\") pod \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " Apr 24 21:34:00.318906 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.318704 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-serving-cert\") pod \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " Apr 24 21:34:00.318906 ip-10-0-134-249 kubenswrapper[2567]: I0424 
21:34:00.318734 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-oauth-config\") pod \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\" (UID: \"fa409b5c-090b-4796-99cc-14e50ee7f4d5\") " Apr 24 21:34:00.319117 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.318952 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-config" (OuterVolumeSpecName: "console-config") pod "fa409b5c-090b-4796-99cc-14e50ee7f4d5" (UID: "fa409b5c-090b-4796-99cc-14e50ee7f4d5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:00.319178 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.319139 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fa409b5c-090b-4796-99cc-14e50ee7f4d5" (UID: "fa409b5c-090b-4796-99cc-14e50ee7f4d5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:00.319348 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.319290 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fa409b5c-090b-4796-99cc-14e50ee7f4d5" (UID: "fa409b5c-090b-4796-99cc-14e50ee7f4d5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:00.319348 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.319337 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-service-ca" (OuterVolumeSpecName: "service-ca") pod "fa409b5c-090b-4796-99cc-14e50ee7f4d5" (UID: "fa409b5c-090b-4796-99cc-14e50ee7f4d5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:34:00.320959 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.320932 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fa409b5c-090b-4796-99cc-14e50ee7f4d5" (UID: "fa409b5c-090b-4796-99cc-14e50ee7f4d5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:34:00.321268 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.321253 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa409b5c-090b-4796-99cc-14e50ee7f4d5-kube-api-access-xvd8h" (OuterVolumeSpecName: "kube-api-access-xvd8h") pod "fa409b5c-090b-4796-99cc-14e50ee7f4d5" (UID: "fa409b5c-090b-4796-99cc-14e50ee7f4d5"). InnerVolumeSpecName "kube-api-access-xvd8h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:34:00.321327 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.321274 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fa409b5c-090b-4796-99cc-14e50ee7f4d5" (UID: "fa409b5c-090b-4796-99cc-14e50ee7f4d5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:34:00.419303 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.419279 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-trusted-ca-bundle\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:34:00.419303 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.419298 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-serving-cert\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:34:00.419436 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.419307 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-oauth-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:34:00.419436 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.419317 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-service-ca\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:34:00.419436 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.419328 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-console-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:34:00.419436 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.419337 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xvd8h\" (UniqueName: \"kubernetes.io/projected/fa409b5c-090b-4796-99cc-14e50ee7f4d5-kube-api-access-xvd8h\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:34:00.419436 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.419345 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa409b5c-090b-4796-99cc-14e50ee7f4d5-oauth-serving-cert\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:34:00.911457 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.911431 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64bb4544d4-lpbvc_fa409b5c-090b-4796-99cc-14e50ee7f4d5/console/0.log" Apr 24 21:34:00.911885 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.911472 2567 generic.go:358] "Generic (PLEG): container finished" podID="fa409b5c-090b-4796-99cc-14e50ee7f4d5" containerID="dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e" exitCode=2 Apr 24 21:34:00.911885 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.911508 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64bb4544d4-lpbvc" event={"ID":"fa409b5c-090b-4796-99cc-14e50ee7f4d5","Type":"ContainerDied","Data":"dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e"} Apr 24 21:34:00.911885 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.911565 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64bb4544d4-lpbvc" Apr 24 21:34:00.911885 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.911572 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64bb4544d4-lpbvc" event={"ID":"fa409b5c-090b-4796-99cc-14e50ee7f4d5","Type":"ContainerDied","Data":"c716ca303a1b240938b9ef9efd6a2eb37ad5c61b2d2ec4e11ecc50f4c78dccf3"} Apr 24 21:34:00.911885 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.911593 2567 scope.go:117] "RemoveContainer" containerID="dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e" Apr 24 21:34:00.920707 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.920642 2567 scope.go:117] "RemoveContainer" containerID="dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e" Apr 24 21:34:00.921188 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:34:00.921121 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e\": container with ID starting with dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e not found: ID does not exist" containerID="dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e" Apr 24 21:34:00.921188 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.921156 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e"} err="failed to get container status \"dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e\": rpc error: code = NotFound desc = could not find container \"dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e\": container with ID starting with dbb6cf102b526d9d37c6e67d1f7f2cc40eb5ef7823e5e6dd8f700f464c3bee0e not found: ID does not exist" Apr 24 21:34:00.938368 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.938343 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64bb4544d4-lpbvc"] Apr 24 21:34:00.944893 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:00.944862 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64bb4544d4-lpbvc"] Apr 24 21:34:01.981971 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:01.981935 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa409b5c-090b-4796-99cc-14e50ee7f4d5" path="/var/lib/kubelet/pods/fa409b5c-090b-4796-99cc-14e50ee7f4d5/volumes" Apr 24 21:34:32.670890 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.670807 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84994c589f-vhr2k"] Apr 24 21:34:32.671428 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.671206 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa409b5c-090b-4796-99cc-14e50ee7f4d5" containerName="console" Apr 24 21:34:32.671428 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.671224 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa409b5c-090b-4796-99cc-14e50ee7f4d5" containerName="console" Apr 24 21:34:32.671428 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.671326 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa409b5c-090b-4796-99cc-14e50ee7f4d5" containerName="console" Apr 24 21:34:32.674239 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.674220 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.690377 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.690353 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84994c589f-vhr2k"] Apr 24 21:34:32.737278 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.737254 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf272bc9-1606-4490-bcf9-fc46efe55151-console-oauth-config\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.737378 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.737288 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf272bc9-1606-4490-bcf9-fc46efe55151-console-serving-cert\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.737378 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.737324 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thlmr\" (UniqueName: \"kubernetes.io/projected/bf272bc9-1606-4490-bcf9-fc46efe55151-kube-api-access-thlmr\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.737452 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.737387 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-console-config\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.737452 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.737416 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-oauth-serving-cert\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.737452 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.737436 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-trusted-ca-bundle\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.737576 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.737455 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-service-ca\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.838249 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.838223 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/bf272bc9-1606-4490-bcf9-fc46efe55151-console-oauth-config\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.838354 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.838254 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf272bc9-1606-4490-bcf9-fc46efe55151-console-serving-cert\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.838354 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.838290 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thlmr\" (UniqueName: \"kubernetes.io/projected/bf272bc9-1606-4490-bcf9-fc46efe55151-kube-api-access-thlmr\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.838354 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.838326 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-console-config\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.838354 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.838349 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-oauth-serving-cert\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.838502 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.838373 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-trusted-ca-bundle\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.838629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.838605 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-service-ca\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.839160 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.839137 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-trusted-ca-bundle\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.839242 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.839164 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-oauth-serving-cert\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.839242 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:34:32.839212 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-console-config\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.839313 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.839282 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-service-ca\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.840777 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.840752 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf272bc9-1606-4490-bcf9-fc46efe55151-console-oauth-config\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.841395 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.841378 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf272bc9-1606-4490-bcf9-fc46efe55151-console-serving-cert\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.847703 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.847678 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thlmr\" (UniqueName: \"kubernetes.io/projected/bf272bc9-1606-4490-bcf9-fc46efe55151-kube-api-access-thlmr\") pod \"console-84994c589f-vhr2k\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:32.983161 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:32.983099 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:33.107729 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:33.107374 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84994c589f-vhr2k"] Apr 24 21:34:33.110208 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:34:33.110155 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf272bc9_1606_4490_bcf9_fc46efe55151.slice/crio-ef606e77d1df8a5708288ebd22548751326872e2deb954884b483f67d32543f3 WatchSource:0}: Error finding container ef606e77d1df8a5708288ebd22548751326872e2deb954884b483f67d32543f3: Status 404 returned error can't find the container with id ef606e77d1df8a5708288ebd22548751326872e2deb954884b483f67d32543f3 Apr 24 21:34:33.112867 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:33.112841 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:34:34.003166 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:34.003131 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84994c589f-vhr2k" event={"ID":"bf272bc9-1606-4490-bcf9-fc46efe55151","Type":"ContainerStarted","Data":"2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367"} Apr 24 21:34:34.003166 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:34.003168 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84994c589f-vhr2k" event={"ID":"bf272bc9-1606-4490-bcf9-fc46efe55151","Type":"ContainerStarted","Data":"ef606e77d1df8a5708288ebd22548751326872e2deb954884b483f67d32543f3"} Apr 24 21:34:34.041483 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:34.041435 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84994c589f-vhr2k" podStartSLOduration=2.041418612 podStartE2EDuration="2.041418612s" podCreationTimestamp="2026-04-24 21:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:34:34.022217595 +0000 UTC m=+358.564169218" watchObservedRunningTime="2026-04-24 21:34:34.041418612 +0000 UTC m=+358.583370211" Apr 24 21:34:42.983931 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:42.983894 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:42.983931 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:42.983938 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:42.988540 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:42.988500 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:43.030646 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:43.030228 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:34:43.078889 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:34:43.078859 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8684b9c47b-kcx4f"] Apr 24 21:35:08.098454 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.098381 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8684b9c47b-kcx4f" podUID="34ea9c87-1756-413e-a9c1-738759f0a080" 
containerName="console" containerID="cri-o://00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c" gracePeriod=15 Apr 24 21:35:08.330435 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.330414 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8684b9c47b-kcx4f_34ea9c87-1756-413e-a9c1-738759f0a080/console/0.log" Apr 24 21:35:08.330556 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.330476 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:35:08.409256 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.409229 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34ea9c87-1756-413e-a9c1-738759f0a080-console-oauth-config\") pod \"34ea9c87-1756-413e-a9c1-738759f0a080\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " Apr 24 21:35:08.409377 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.409290 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-trusted-ca-bundle\") pod \"34ea9c87-1756-413e-a9c1-738759f0a080\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " Apr 24 21:35:08.409377 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.409346 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzj7w\" (UniqueName: \"kubernetes.io/projected/34ea9c87-1756-413e-a9c1-738759f0a080-kube-api-access-lzj7w\") pod \"34ea9c87-1756-413e-a9c1-738759f0a080\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " Apr 24 21:35:08.409377 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.409366 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-service-ca\") pod \"34ea9c87-1756-413e-a9c1-738759f0a080\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " Apr 24 21:35:08.409506 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.409388 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-oauth-serving-cert\") pod \"34ea9c87-1756-413e-a9c1-738759f0a080\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " Apr 24 21:35:08.409506 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.409408 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34ea9c87-1756-413e-a9c1-738759f0a080-console-serving-cert\") pod \"34ea9c87-1756-413e-a9c1-738759f0a080\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " Apr 24 21:35:08.409506 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.409455 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-console-config\") pod \"34ea9c87-1756-413e-a9c1-738759f0a080\" (UID: \"34ea9c87-1756-413e-a9c1-738759f0a080\") " Apr 24 21:35:08.409685 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.409658 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "34ea9c87-1756-413e-a9c1-738759f0a080" 
(UID: "34ea9c87-1756-413e-a9c1-738759f0a080"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:35:08.409761 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.409744 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-trusted-ca-bundle\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:35:08.409821 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.409786 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-service-ca" (OuterVolumeSpecName: "service-ca") pod "34ea9c87-1756-413e-a9c1-738759f0a080" (UID: "34ea9c87-1756-413e-a9c1-738759f0a080"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:35:08.409883 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.409826 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "34ea9c87-1756-413e-a9c1-738759f0a080" (UID: "34ea9c87-1756-413e-a9c1-738759f0a080"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:35:08.409946 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.409936 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-console-config" (OuterVolumeSpecName: "console-config") pod "34ea9c87-1756-413e-a9c1-738759f0a080" (UID: "34ea9c87-1756-413e-a9c1-738759f0a080"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:35:08.411598 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.411574 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ea9c87-1756-413e-a9c1-738759f0a080-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "34ea9c87-1756-413e-a9c1-738759f0a080" (UID: "34ea9c87-1756-413e-a9c1-738759f0a080"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:35:08.411957 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.411939 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34ea9c87-1756-413e-a9c1-738759f0a080-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "34ea9c87-1756-413e-a9c1-738759f0a080" (UID: "34ea9c87-1756-413e-a9c1-738759f0a080"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:35:08.412009 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.411961 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ea9c87-1756-413e-a9c1-738759f0a080-kube-api-access-lzj7w" (OuterVolumeSpecName: "kube-api-access-lzj7w") pod "34ea9c87-1756-413e-a9c1-738759f0a080" (UID: "34ea9c87-1756-413e-a9c1-738759f0a080"). InnerVolumeSpecName "kube-api-access-lzj7w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:35:08.510350 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.510329 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-oauth-serving-cert\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:35:08.510350 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.510349 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34ea9c87-1756-413e-a9c1-738759f0a080-console-serving-cert\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:35:08.510477 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.510360 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-console-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:35:08.510477 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.510369 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34ea9c87-1756-413e-a9c1-738759f0a080-console-oauth-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:35:08.510477 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.510377 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lzj7w\" (UniqueName: \"kubernetes.io/projected/34ea9c87-1756-413e-a9c1-738759f0a080-kube-api-access-lzj7w\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:35:08.510477 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:08.510386 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34ea9c87-1756-413e-a9c1-738759f0a080-service-ca\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:35:09.105694 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:09.105669 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8684b9c47b-kcx4f_34ea9c87-1756-413e-a9c1-738759f0a080/console/0.log" Apr 24 21:35:09.106108 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:09.105709 2567 generic.go:358] "Generic (PLEG): container finished" podID="34ea9c87-1756-413e-a9c1-738759f0a080" containerID="00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c" exitCode=2 Apr 24 21:35:09.106108 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:09.105736 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8684b9c47b-kcx4f" event={"ID":"34ea9c87-1756-413e-a9c1-738759f0a080","Type":"ContainerDied","Data":"00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c"} Apr 24 21:35:09.106108 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:09.105759 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8684b9c47b-kcx4f" event={"ID":"34ea9c87-1756-413e-a9c1-738759f0a080","Type":"ContainerDied","Data":"bdbdfc9fcfd3bd819afbcb05efa68359aac80ab26a77c7eda6e5476c1cc27f03"} Apr 24 21:35:09.106108 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:09.105774 2567 scope.go:117] "RemoveContainer" containerID="00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c" Apr 24 21:35:09.106108 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:09.105779 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8684b9c47b-kcx4f" Apr 24 21:35:09.114121 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:09.114099 2567 scope.go:117] "RemoveContainer" containerID="00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c" Apr 24 21:35:09.114371 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:35:09.114349 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c\": container with ID starting with 00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c not found: ID does not exist" containerID="00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c" Apr 24 21:35:09.114500 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:09.114377 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c"} err="failed to get container status \"00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c\": rpc error: code = NotFound desc = could not find container \"00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c\": container with ID starting with 00aa4e03b51bfebdcedc52d189b497b659943ba62b2c1a9309d6c4eb58b61a9c not found: ID does not exist" Apr 24 21:35:09.128745 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:09.128717 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8684b9c47b-kcx4f"] Apr 24 21:35:09.133777 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:09.133758 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8684b9c47b-kcx4f"] Apr 24 21:35:09.981997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:35:09.981964 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ea9c87-1756-413e-a9c1-738759f0a080" path="/var/lib/kubelet/pods/34ea9c87-1756-413e-a9c1-738759f0a080/volumes" Apr 24 21:36:45.864410 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.864368 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9"] Apr 24 21:36:45.864880 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.864661 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34ea9c87-1756-413e-a9c1-738759f0a080" containerName="console" Apr 24 21:36:45.864880 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.864673 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ea9c87-1756-413e-a9c1-738759f0a080" containerName="console" Apr 24 21:36:45.864880 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.864724 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="34ea9c87-1756-413e-a9c1-738759f0a080" containerName="console" Apr 24 21:36:45.867658 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.867640 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:36:45.869616 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.869592 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:36:45.870292 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.870273 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:36:45.870404 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.870292 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mpzkf\"" Apr 24 21:36:45.876361 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.876341 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9"] Apr 24 21:36:45.888631 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.888609 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2f1c04-90ba-4057-9563-2e8079747c32-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9\" (UID: \"8f2f1c04-90ba-4057-9563-2e8079747c32\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:36:45.888729 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.888654 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9826z\" (UniqueName: \"kubernetes.io/projected/8f2f1c04-90ba-4057-9563-2e8079747c32-kube-api-access-9826z\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9\" (UID: \"8f2f1c04-90ba-4057-9563-2e8079747c32\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:36:45.888729 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.888694 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2f1c04-90ba-4057-9563-2e8079747c32-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9\" (UID: \"8f2f1c04-90ba-4057-9563-2e8079747c32\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:36:45.989767 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.989744 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2f1c04-90ba-4057-9563-2e8079747c32-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9\" (UID: \"8f2f1c04-90ba-4057-9563-2e8079747c32\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:36:45.989871 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.989786 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9826z\" (UniqueName: \"kubernetes.io/projected/8f2f1c04-90ba-4057-9563-2e8079747c32-kube-api-access-9826z\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9\" (UID: \"8f2f1c04-90ba-4057-9563-2e8079747c32\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:36:45.989929 ip-10-0-134-249 kubenswrapper[2567]: I0424 
21:36:45.989912 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2f1c04-90ba-4057-9563-2e8079747c32-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9\" (UID: \"8f2f1c04-90ba-4057-9563-2e8079747c32\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:36:45.990092 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.990076 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2f1c04-90ba-4057-9563-2e8079747c32-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9\" (UID: \"8f2f1c04-90ba-4057-9563-2e8079747c32\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:36:45.990163 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:45.990149 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2f1c04-90ba-4057-9563-2e8079747c32-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9\" (UID: \"8f2f1c04-90ba-4057-9563-2e8079747c32\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:36:46.000715 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:46.000695 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9826z\" (UniqueName: \"kubernetes.io/projected/8f2f1c04-90ba-4057-9563-2e8079747c32-kube-api-access-9826z\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9\" (UID: \"8f2f1c04-90ba-4057-9563-2e8079747c32\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:36:46.177031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:46.176973 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:36:46.298021 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:46.297998 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9"] Apr 24 21:36:46.300048 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:36:46.300019 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f2f1c04_90ba_4057_9563_2e8079747c32.slice/crio-041a868682574af192943a3c0d2d7f1a85454aeba0e4d01bcf11baa4cc4769c0 WatchSource:0}: Error finding container 041a868682574af192943a3c0d2d7f1a85454aeba0e4d01bcf11baa4cc4769c0: Status 404 returned error can't find the container with id 041a868682574af192943a3c0d2d7f1a85454aeba0e4d01bcf11baa4cc4769c0 Apr 24 21:36:46.382902 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:46.382875 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" event={"ID":"8f2f1c04-90ba-4057-9563-2e8079747c32","Type":"ContainerStarted","Data":"041a868682574af192943a3c0d2d7f1a85454aeba0e4d01bcf11baa4cc4769c0"} Apr 24 21:36:53.403739 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:53.403697 2567 generic.go:358] "Generic (PLEG): container finished" podID="8f2f1c04-90ba-4057-9563-2e8079747c32" containerID="5e809f7d95ada1d329a54cb9ade0f29312cefaecda406b3c1e72f176efd904f1" exitCode=0 Apr 24 21:36:53.404147 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:53.403759 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" event={"ID":"8f2f1c04-90ba-4057-9563-2e8079747c32","Type":"ContainerDied","Data":"5e809f7d95ada1d329a54cb9ade0f29312cefaecda406b3c1e72f176efd904f1"} Apr 24 21:36:56.414173 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:56.414091 2567 generic.go:358] "Generic (PLEG): container finished" podID="8f2f1c04-90ba-4057-9563-2e8079747c32" containerID="e30c5829366712ccd974f8303894aaadf826fb3642e4c2145661d688c34ca4fe" exitCode=0 Apr 24 21:36:56.414173 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:36:56.414136 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" event={"ID":"8f2f1c04-90ba-4057-9563-2e8079747c32","Type":"ContainerDied","Data":"e30c5829366712ccd974f8303894aaadf826fb3642e4c2145661d688c34ca4fe"} Apr 24 21:37:03.435089 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:03.435052 2567 generic.go:358] "Generic (PLEG): container finished" podID="8f2f1c04-90ba-4057-9563-2e8079747c32" containerID="c5f95aebd72f377b08f202b200e009a50aa9faa5729642b9b517452dc2e6fbbc" exitCode=0 Apr 24 21:37:03.435460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:03.435124 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" event={"ID":"8f2f1c04-90ba-4057-9563-2e8079747c32","Type":"ContainerDied","Data":"c5f95aebd72f377b08f202b200e009a50aa9faa5729642b9b517452dc2e6fbbc"} Apr 24 21:37:04.553880 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:04.553853 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:37:04.636605 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:04.636580 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2f1c04-90ba-4057-9563-2e8079747c32-util\") pod \"8f2f1c04-90ba-4057-9563-2e8079747c32\" (UID: \"8f2f1c04-90ba-4057-9563-2e8079747c32\") " Apr 24 21:37:04.636746 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:04.636646 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9826z\" (UniqueName: \"kubernetes.io/projected/8f2f1c04-90ba-4057-9563-2e8079747c32-kube-api-access-9826z\") pod \"8f2f1c04-90ba-4057-9563-2e8079747c32\" (UID: \"8f2f1c04-90ba-4057-9563-2e8079747c32\") " Apr 24 21:37:04.636746 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:04.636675 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2f1c04-90ba-4057-9563-2e8079747c32-bundle\") pod \"8f2f1c04-90ba-4057-9563-2e8079747c32\" (UID: \"8f2f1c04-90ba-4057-9563-2e8079747c32\") " Apr 24 21:37:04.637587 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:04.637562 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f2f1c04-90ba-4057-9563-2e8079747c32-bundle" (OuterVolumeSpecName: "bundle") pod "8f2f1c04-90ba-4057-9563-2e8079747c32" (UID: "8f2f1c04-90ba-4057-9563-2e8079747c32"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:37:04.639207 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:04.639185 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f2f1c04-90ba-4057-9563-2e8079747c32-kube-api-access-9826z" (OuterVolumeSpecName: "kube-api-access-9826z") pod "8f2f1c04-90ba-4057-9563-2e8079747c32" (UID: "8f2f1c04-90ba-4057-9563-2e8079747c32"). InnerVolumeSpecName "kube-api-access-9826z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:37:04.640911 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:04.640888 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f2f1c04-90ba-4057-9563-2e8079747c32-util" (OuterVolumeSpecName: "util") pod "8f2f1c04-90ba-4057-9563-2e8079747c32" (UID: "8f2f1c04-90ba-4057-9563-2e8079747c32"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:37:04.737731 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:04.737660 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9826z\" (UniqueName: \"kubernetes.io/projected/8f2f1c04-90ba-4057-9563-2e8079747c32-kube-api-access-9826z\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:37:04.737731 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:04.737687 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f2f1c04-90ba-4057-9563-2e8079747c32-bundle\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:37:04.737731 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:04.737700 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f2f1c04-90ba-4057-9563-2e8079747c32-util\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:37:05.442544 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:05.442487 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" event={"ID":"8f2f1c04-90ba-4057-9563-2e8079747c32","Type":"ContainerDied","Data":"041a868682574af192943a3c0d2d7f1a85454aeba0e4d01bcf11baa4cc4769c0"} Apr 24 21:37:05.442544 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:05.442544 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="041a868682574af192943a3c0d2d7f1a85454aeba0e4d01bcf11baa4cc4769c0" Apr 24 21:37:05.442803 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:05.442499 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csdbb9" Apr 24 21:37:07.859392 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.859341 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46"] Apr 24 21:37:07.859760 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.859669 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f2f1c04-90ba-4057-9563-2e8079747c32" containerName="extract" Apr 24 21:37:07.859760 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.859682 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2f1c04-90ba-4057-9563-2e8079747c32" containerName="extract" Apr 24 21:37:07.859760 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.859698 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f2f1c04-90ba-4057-9563-2e8079747c32" containerName="util" Apr 24 21:37:07.859760 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.859704 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2f1c04-90ba-4057-9563-2e8079747c32" containerName="util" Apr 24 21:37:07.859760 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.859710 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f2f1c04-90ba-4057-9563-2e8079747c32" containerName="pull" Apr 24 21:37:07.859760 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.859716 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f2f1c04-90ba-4057-9563-2e8079747c32" containerName="pull" Apr 24 21:37:07.859952 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.859770 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f2f1c04-90ba-4057-9563-2e8079747c32" containerName="extract" Apr 
24 21:37:07.863115 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.863099 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" Apr 24 21:37:07.866853 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.866830 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-jmdc6\"" Apr 24 21:37:07.875160 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.875139 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:37:07.875242 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.875160 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:37:07.875242 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.875141 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 21:37:07.888568 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.888544 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46"] Apr 24 21:37:07.957016 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.956991 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/92e5593c-4b37-4ba0-b8e4-f4e36dc059e6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-f7f46\" (UID: \"92e5593c-4b37-4ba0-b8e4-f4e36dc059e6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" Apr 24 21:37:07.957114 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:07.957038 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjdg6\" (UniqueName: \"kubernetes.io/projected/92e5593c-4b37-4ba0-b8e4-f4e36dc059e6-kube-api-access-bjdg6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-f7f46\" (UID: \"92e5593c-4b37-4ba0-b8e4-f4e36dc059e6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" Apr 24 21:37:08.058288 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:08.058264 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/92e5593c-4b37-4ba0-b8e4-f4e36dc059e6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-f7f46\" (UID: \"92e5593c-4b37-4ba0-b8e4-f4e36dc059e6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" Apr 24 21:37:08.058424 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:08.058309 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjdg6\" (UniqueName: \"kubernetes.io/projected/92e5593c-4b37-4ba0-b8e4-f4e36dc059e6-kube-api-access-bjdg6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-f7f46\" (UID: \"92e5593c-4b37-4ba0-b8e4-f4e36dc059e6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" Apr 24 21:37:08.060542 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:08.060507 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/92e5593c-4b37-4ba0-b8e4-f4e36dc059e6-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-f7f46\" (UID: \"92e5593c-4b37-4ba0-b8e4-f4e36dc059e6\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" Apr 24 21:37:08.067805 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:08.067781 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjdg6\" (UniqueName: \"kubernetes.io/projected/92e5593c-4b37-4ba0-b8e4-f4e36dc059e6-kube-api-access-bjdg6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-f7f46\" (UID: \"92e5593c-4b37-4ba0-b8e4-f4e36dc059e6\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" Apr 24 21:37:08.172886 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:08.172839 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" Apr 24 21:37:08.292228 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:08.292179 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46"] Apr 24 21:37:08.295932 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:37:08.295906 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92e5593c_4b37_4ba0_b8e4_f4e36dc059e6.slice/crio-5e55df59161bd58923ebedda3108db13b05d936d1576e20917304244a35d58c6 WatchSource:0}: Error finding container 5e55df59161bd58923ebedda3108db13b05d936d1576e20917304244a35d58c6: Status 404 returned error can't find the container with id 5e55df59161bd58923ebedda3108db13b05d936d1576e20917304244a35d58c6 Apr 24 21:37:08.452841 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:08.452760 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" event={"ID":"92e5593c-4b37-4ba0-b8e4-f4e36dc059e6","Type":"ContainerStarted","Data":"5e55df59161bd58923ebedda3108db13b05d936d1576e20917304244a35d58c6"} Apr 24 21:37:11.465168 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:11.465131 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" event={"ID":"92e5593c-4b37-4ba0-b8e4-f4e36dc059e6","Type":"ContainerStarted","Data":"67cd25a6518450dfa9cbfc8b5b948ded73698e601762c97bfc7e00b144afdbc2"} Apr 24 21:37:11.465599 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:11.465248 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" Apr 24 21:37:11.486654 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:11.486610 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" podStartSLOduration=1.447179185 podStartE2EDuration="4.486596538s" podCreationTimestamp="2026-04-24 21:37:07 +0000 UTC" firstStartedPulling="2026-04-24 21:37:08.297666896 +0000 UTC m=+512.839618473" lastFinishedPulling="2026-04-24 21:37:11.33708425 +0000 UTC m=+515.879035826" observedRunningTime="2026-04-24 21:37:11.485640133 +0000 UTC m=+516.027591756" watchObservedRunningTime="2026-04-24 21:37:11.486596538 +0000 UTC m=+516.028548131" Apr 24 21:37:11.839318 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:11.839290 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-pqh5b"] Apr 24 21:37:11.845654 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:11.845628 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:11.848406 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:11.848194 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-twqh4\"" Apr 24 21:37:11.848587 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:11.848505 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 21:37:11.848854 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:11.848835 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 21:37:11.850717 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:11.850693 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-pqh5b"] Apr 24 21:37:11.990869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:11.990837 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:11.990992 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:11.990890 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ade10e20-2ca9-4c92-aaa2-072c2803101b-cabundle0\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:11.990992 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:11.990969 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bgjl\" (UniqueName: \"kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-kube-api-access-4bgjl\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:12.091364 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.091289 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ade10e20-2ca9-4c92-aaa2-072c2803101b-cabundle0\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:12.091364 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.091349 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bgjl\" (UniqueName: \"kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-kube-api-access-4bgjl\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:12.091581 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.091413 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:12.091581 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.091551 2567 projected.go:264] Couldn't get 
secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 24 21:37:12.091581 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.091571 2567 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:37:12.091581 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.091581 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:37:12.091768 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.091596 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-pqh5b: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 21:37:12.091768 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.091669 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates podName:ade10e20-2ca9-4c92-aaa2-072c2803101b nodeName:}" failed. No retries permitted until 2026-04-24 21:37:12.591646748 +0000 UTC m=+517.133598341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates") pod "keda-operator-ffbb595cb-pqh5b" (UID: "ade10e20-2ca9-4c92-aaa2-072c2803101b") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 21:37:12.091931 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.091911 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/ade10e20-2ca9-4c92-aaa2-072c2803101b-cabundle0\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:12.099992 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.099968 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bgjl\" (UniqueName: \"kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-kube-api-access-4bgjl\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:12.190206 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.190175 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr"] Apr 24 21:37:12.193459 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.193440 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:12.195355 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.195338 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 21:37:12.202333 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.202306 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr"] Apr 24 21:37:12.292854 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.292832 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:12.292979 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.292869 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5krs\" (UniqueName: \"kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-kube-api-access-h5krs\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:12.292979 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.292920 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/61cc06bd-e96d-4616-a322-5f800e5729f3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:12.393684 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.393655 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/61cc06bd-e96d-4616-a322-5f800e5729f3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:12.393819 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.393704 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:12.393819 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.393746 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5krs\" (UniqueName: \"kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-kube-api-access-h5krs\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:12.393891 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.393872 2567 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:37:12.393928 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.393893 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:37:12.393928 ip-10-0-134-249 
kubenswrapper[2567]: E0424 21:37:12.393910 2567 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 24 21:37:12.393928 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.393927 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 24 21:37:12.394011 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.393974 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates podName:61cc06bd-e96d-4616-a322-5f800e5729f3 nodeName:}" failed. No retries permitted until 2026-04-24 21:37:12.893958363 +0000 UTC m=+517.435909939 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates") pod "keda-metrics-apiserver-7c9f485588-swcsr" (UID: "61cc06bd-e96d-4616-a322-5f800e5729f3") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 24 21:37:12.394082 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.394066 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/61cc06bd-e96d-4616-a322-5f800e5729f3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:12.399999 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.399978 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-xglzq"] Apr 24 21:37:12.403195 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.403181 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-xglzq" Apr 24 21:37:12.404699 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.404675 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5krs\" (UniqueName: \"kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-kube-api-access-h5krs\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:12.405262 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.405248 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 21:37:12.414610 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.414590 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-xglzq"] Apr 24 21:37:12.494439 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.494401 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhn9k\" (UniqueName: \"kubernetes.io/projected/87c66183-775c-4578-bbf9-8c89b3a927fc-kube-api-access-nhn9k\") pod \"keda-admission-cf49989db-xglzq\" (UID: \"87c66183-775c-4578-bbf9-8c89b3a927fc\") " pod="openshift-keda/keda-admission-cf49989db-xglzq" Apr 24 21:37:12.494809 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.494472 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87c66183-775c-4578-bbf9-8c89b3a927fc-certificates\") pod \"keda-admission-cf49989db-xglzq\" (UID: \"87c66183-775c-4578-bbf9-8c89b3a927fc\") " pod="openshift-keda/keda-admission-cf49989db-xglzq" Apr 24 21:37:12.595729 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.595696 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:12.595895 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.595784 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhn9k\" (UniqueName: \"kubernetes.io/projected/87c66183-775c-4578-bbf9-8c89b3a927fc-kube-api-access-nhn9k\") pod \"keda-admission-cf49989db-xglzq\" (UID: \"87c66183-775c-4578-bbf9-8c89b3a927fc\") " pod="openshift-keda/keda-admission-cf49989db-xglzq" Apr 24 21:37:12.595895 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.595839 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87c66183-775c-4578-bbf9-8c89b3a927fc-certificates\") pod \"keda-admission-cf49989db-xglzq\" (UID: \"87c66183-775c-4578-bbf9-8c89b3a927fc\") " pod="openshift-keda/keda-admission-cf49989db-xglzq" Apr 24 21:37:12.595974 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.595900 2567 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:37:12.595974 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.595921 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:37:12.595974 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.595932 2567 projected.go:194] Error preparing data for projected 
volume certificates for pod openshift-keda/keda-operator-ffbb595cb-pqh5b: references non-existent secret key: ca.crt Apr 24 21:37:12.595974 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.595946 2567 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 24 21:37:12.595974 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.595964 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-xglzq: secret "keda-admission-webhooks-certs" not found Apr 24 21:37:12.596118 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.595985 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates podName:ade10e20-2ca9-4c92-aaa2-072c2803101b nodeName:}" failed. No retries permitted until 2026-04-24 21:37:13.595967535 +0000 UTC m=+518.137919127 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates") pod "keda-operator-ffbb595cb-pqh5b" (UID: "ade10e20-2ca9-4c92-aaa2-072c2803101b") : references non-existent secret key: ca.crt Apr 24 21:37:12.596118 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.596013 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/87c66183-775c-4578-bbf9-8c89b3a927fc-certificates podName:87c66183-775c-4578-bbf9-8c89b3a927fc nodeName:}" failed. No retries permitted until 2026-04-24 21:37:13.096001188 +0000 UTC m=+517.637952766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/87c66183-775c-4578-bbf9-8c89b3a927fc-certificates") pod "keda-admission-cf49989db-xglzq" (UID: "87c66183-775c-4578-bbf9-8c89b3a927fc") : secret "keda-admission-webhooks-certs" not found Apr 24 21:37:12.607513 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.607485 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhn9k\" (UniqueName: \"kubernetes.io/projected/87c66183-775c-4578-bbf9-8c89b3a927fc-kube-api-access-nhn9k\") pod \"keda-admission-cf49989db-xglzq\" (UID: \"87c66183-775c-4578-bbf9-8c89b3a927fc\") " pod="openshift-keda/keda-admission-cf49989db-xglzq" Apr 24 21:37:12.898411 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:12.898383 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:12.898586 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.898513 2567 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:37:12.898586 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.898547 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:37:12.898586 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.898565 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr: references non-existent secret key: tls.crt Apr 24 21:37:12.898687 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:12.898615 2567 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates podName:61cc06bd-e96d-4616-a322-5f800e5729f3 nodeName:}" failed. No retries permitted until 2026-04-24 21:37:13.898600434 +0000 UTC m=+518.440552011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates") pod "keda-metrics-apiserver-7c9f485588-swcsr" (UID: "61cc06bd-e96d-4616-a322-5f800e5729f3") : references non-existent secret key: tls.crt Apr 24 21:37:13.100089 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:13.100060 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87c66183-775c-4578-bbf9-8c89b3a927fc-certificates\") pod \"keda-admission-cf49989db-xglzq\" (UID: \"87c66183-775c-4578-bbf9-8c89b3a927fc\") " pod="openshift-keda/keda-admission-cf49989db-xglzq" Apr 24 21:37:13.102567 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:13.102542 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/87c66183-775c-4578-bbf9-8c89b3a927fc-certificates\") pod \"keda-admission-cf49989db-xglzq\" (UID: \"87c66183-775c-4578-bbf9-8c89b3a927fc\") " pod="openshift-keda/keda-admission-cf49989db-xglzq" Apr 24 21:37:13.322112 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:13.322038 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-xglzq" Apr 24 21:37:13.445765 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:13.445733 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-xglzq"] Apr 24 21:37:13.448853 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:37:13.448828 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87c66183_775c_4578_bbf9_8c89b3a927fc.slice/crio-622bfe366b4d5059ef6e9833cdad5278654b095fdd63f88f3904f5dbc0875148 WatchSource:0}: Error finding container 622bfe366b4d5059ef6e9833cdad5278654b095fdd63f88f3904f5dbc0875148: Status 404 returned error can't find the container with id 622bfe366b4d5059ef6e9833cdad5278654b095fdd63f88f3904f5dbc0875148 Apr 24 21:37:13.473879 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:13.473854 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-xglzq" event={"ID":"87c66183-775c-4578-bbf9-8c89b3a927fc","Type":"ContainerStarted","Data":"622bfe366b4d5059ef6e9833cdad5278654b095fdd63f88f3904f5dbc0875148"} Apr 24 21:37:13.604405 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:13.604380 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:13.604837 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:13.604507 2567 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:37:13.604837 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:13.604547 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:37:13.604837 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:13.604561 2567 projected.go:194] Error preparing 
data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-pqh5b: references non-existent secret key: ca.crt Apr 24 21:37:13.604837 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:13.604622 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates podName:ade10e20-2ca9-4c92-aaa2-072c2803101b nodeName:}" failed. No retries permitted until 2026-04-24 21:37:15.604602357 +0000 UTC m=+520.146553949 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates") pod "keda-operator-ffbb595cb-pqh5b" (UID: "ade10e20-2ca9-4c92-aaa2-072c2803101b") : references non-existent secret key: ca.crt Apr 24 21:37:13.907011 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:13.906942 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:13.907147 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:13.907100 2567 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:37:13.907147 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:13.907121 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:37:13.907147 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:13.907141 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr: references non-existent secret key: tls.crt Apr 24 21:37:13.907281 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:13.907203 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates podName:61cc06bd-e96d-4616-a322-5f800e5729f3 nodeName:}" failed. No retries permitted until 2026-04-24 21:37:15.907184315 +0000 UTC m=+520.449135906 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates") pod "keda-metrics-apiserver-7c9f485588-swcsr" (UID: "61cc06bd-e96d-4616-a322-5f800e5729f3") : references non-existent secret key: tls.crt Apr 24 21:37:15.481876 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:15.481841 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-xglzq" event={"ID":"87c66183-775c-4578-bbf9-8c89b3a927fc","Type":"ContainerStarted","Data":"3db3b30a0f783444b0949e819a2919567e888ff93a800d9cb36e77db2531e520"} Apr 24 21:37:15.482224 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:15.481974 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-xglzq" Apr 24 21:37:15.497599 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:15.497546 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-xglzq" podStartSLOduration=2.097373547 podStartE2EDuration="3.497506942s" podCreationTimestamp="2026-04-24 21:37:12 +0000 UTC" firstStartedPulling="2026-04-24 21:37:13.450560726 +0000 UTC m=+517.992512304" lastFinishedPulling="2026-04-24 21:37:14.850694106 +0000 UTC m=+519.392645699" observedRunningTime="2026-04-24 21:37:15.497123823 +0000 UTC m=+520.039075497" watchObservedRunningTime="2026-04-24 21:37:15.497506942 +0000 UTC m=+520.039458543" Apr 24 21:37:15.623089 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:15.623058 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:15.623248 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:15.623167 2567 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:37:15.623248 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:15.623186 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:37:15.623248 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:15.623197 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-pqh5b: references non-existent secret key: ca.crt Apr 24 21:37:15.623378 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:15.623261 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates podName:ade10e20-2ca9-4c92-aaa2-072c2803101b nodeName:}" failed. No retries permitted until 2026-04-24 21:37:19.623241684 +0000 UTC m=+524.165193267 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates") pod "keda-operator-ffbb595cb-pqh5b" (UID: "ade10e20-2ca9-4c92-aaa2-072c2803101b") : references non-existent secret key: ca.crt Apr 24 21:37:15.925233 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:15.925205 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:15.925402 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:15.925356 2567 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:37:15.925402 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:15.925377 2567 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:37:15.925402 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:15.925400 2567 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr: references non-existent secret key: tls.crt Apr 24 21:37:15.925540 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:37:15.925463 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates podName:61cc06bd-e96d-4616-a322-5f800e5729f3 nodeName:}" failed. No retries permitted until 2026-04-24 21:37:19.925444784 +0000 UTC m=+524.467396369 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates") pod "keda-metrics-apiserver-7c9f485588-swcsr" (UID: "61cc06bd-e96d-4616-a322-5f800e5729f3") : references non-existent secret key: tls.crt Apr 24 21:37:19.653479 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:19.653446 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:19.655937 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:19.655909 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ade10e20-2ca9-4c92-aaa2-072c2803101b-certificates\") pod \"keda-operator-ffbb595cb-pqh5b\" (UID: \"ade10e20-2ca9-4c92-aaa2-072c2803101b\") " pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:19.656681 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:19.656663 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:19.773227 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:19.773195 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-pqh5b"] Apr 24 21:37:19.775373 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:37:19.775339 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podade10e20_2ca9_4c92_aaa2_072c2803101b.slice/crio-98f7b47f58e6f3d058a8237aee117e6bb1af6bdcbc926b37f4882ead1cb865aa WatchSource:0}: Error finding container 98f7b47f58e6f3d058a8237aee117e6bb1af6bdcbc926b37f4882ead1cb865aa: Status 404 returned error can't find the container with id 98f7b47f58e6f3d058a8237aee117e6bb1af6bdcbc926b37f4882ead1cb865aa Apr 24 21:37:19.955892 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:19.955807 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:19.958343 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:19.958317 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/61cc06bd-e96d-4616-a322-5f800e5729f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-swcsr\" (UID: \"61cc06bd-e96d-4616-a322-5f800e5729f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:20.003954 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:20.003933 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:20.119359 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:20.119334 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr"] Apr 24 21:37:20.121175 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:37:20.121145 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61cc06bd_e96d_4616_a322_5f800e5729f3.slice/crio-94aa15e739a3cbbf33c51eeae58d7786ebf399728a06ea1ee1812883ec4e5426 WatchSource:0}: Error finding container 94aa15e739a3cbbf33c51eeae58d7786ebf399728a06ea1ee1812883ec4e5426: Status 404 returned error can't find the container with id 94aa15e739a3cbbf33c51eeae58d7786ebf399728a06ea1ee1812883ec4e5426 Apr 24 21:37:20.500508 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:20.500468 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" event={"ID":"61cc06bd-e96d-4616-a322-5f800e5729f3","Type":"ContainerStarted","Data":"94aa15e739a3cbbf33c51eeae58d7786ebf399728a06ea1ee1812883ec4e5426"} Apr 24 21:37:20.501847 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:20.501816 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" event={"ID":"ade10e20-2ca9-4c92-aaa2-072c2803101b","Type":"ContainerStarted","Data":"98f7b47f58e6f3d058a8237aee117e6bb1af6bdcbc926b37f4882ead1cb865aa"} Apr 24 21:37:24.516150 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:24.516056 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" event={"ID":"ade10e20-2ca9-4c92-aaa2-072c2803101b","Type":"ContainerStarted","Data":"afbf2ac7cfe8dd605bc7f756aa1c3fc20de9b75abc64ed0f983d4e7dc955bce8"} Apr 24 21:37:24.516568 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:24.516206 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:37:24.517387 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:24.517367 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" event={"ID":"61cc06bd-e96d-4616-a322-5f800e5729f3","Type":"ContainerStarted","Data":"9c0a606e38f10bdd7e949ee288a0d57f5b9dcbcb4c2c7c0cf9e8cdd00814376b"} Apr 24 21:37:24.517537 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:24.517510 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:24.555639 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:24.555598 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" podStartSLOduration=9.868935766 podStartE2EDuration="13.555587517s" podCreationTimestamp="2026-04-24 21:37:11 +0000 UTC" firstStartedPulling="2026-04-24 21:37:19.776554095 +0000 UTC m=+524.318505686" lastFinishedPulling="2026-04-24 21:37:23.46320586 +0000 UTC m=+528.005157437" observedRunningTime="2026-04-24 21:37:24.554307274 +0000 UTC m=+529.096258872" watchObservedRunningTime="2026-04-24 21:37:24.555587517 +0000 UTC m=+529.097539117" Apr 24 21:37:24.580469 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:24.580420 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" podStartSLOduration=9.244962853 
podStartE2EDuration="12.580409004s" podCreationTimestamp="2026-04-24 21:37:12 +0000 UTC" firstStartedPulling="2026-04-24 21:37:20.122616328 +0000 UTC m=+524.664567906" lastFinishedPulling="2026-04-24 21:37:23.458062473 +0000 UTC m=+528.000014057" observedRunningTime="2026-04-24 21:37:24.580269513 +0000 UTC m=+529.122221111" watchObservedRunningTime="2026-04-24 21:37:24.580409004 +0000 UTC m=+529.122360602" Apr 24 21:37:32.470921 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:32.470891 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-f7f46" Apr 24 21:37:35.524650 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:35.524623 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-swcsr" Apr 24 21:37:36.486659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:36.486632 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-xglzq" Apr 24 21:37:45.523234 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:37:45.523202 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-pqh5b" Apr 24 21:38:17.688332 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.688300 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-zfqls"] Apr 24 21:38:17.694097 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.694076 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-zfqls" Apr 24 21:38:17.696940 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.696920 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vf9w4\"" Apr 24 21:38:17.697054 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.696971 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:38:17.697054 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.697010 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:38:17.697697 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.697682 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:38:17.702578 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.702553 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-zfqls"] Apr 24 21:38:17.792427 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.792399 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvdmc\" (UniqueName: \"kubernetes.io/projected/186b84da-4fba-40a2-8597-bda518c452d3-kube-api-access-wvdmc\") pod \"seaweedfs-86cc847c5c-zfqls\" (UID: \"186b84da-4fba-40a2-8597-bda518c452d3\") " pod="kserve/seaweedfs-86cc847c5c-zfqls" Apr 24 21:38:17.792563 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.792453 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/186b84da-4fba-40a2-8597-bda518c452d3-data\") pod \"seaweedfs-86cc847c5c-zfqls\" (UID: \"186b84da-4fba-40a2-8597-bda518c452d3\") " pod="kserve/seaweedfs-86cc847c5c-zfqls" Apr 24 21:38:17.893302 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.893281 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/186b84da-4fba-40a2-8597-bda518c452d3-data\") pod \"seaweedfs-86cc847c5c-zfqls\" (UID: \"186b84da-4fba-40a2-8597-bda518c452d3\") " pod="kserve/seaweedfs-86cc847c5c-zfqls" Apr 24 21:38:17.893440 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.893335 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvdmc\" (UniqueName: \"kubernetes.io/projected/186b84da-4fba-40a2-8597-bda518c452d3-kube-api-access-wvdmc\") pod \"seaweedfs-86cc847c5c-zfqls\" (UID: \"186b84da-4fba-40a2-8597-bda518c452d3\") " pod="kserve/seaweedfs-86cc847c5c-zfqls" Apr 24 21:38:17.893653 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.893635 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/186b84da-4fba-40a2-8597-bda518c452d3-data\") pod \"seaweedfs-86cc847c5c-zfqls\" (UID: \"186b84da-4fba-40a2-8597-bda518c452d3\") " pod="kserve/seaweedfs-86cc847c5c-zfqls" Apr 24 21:38:17.905898 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:17.905872 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvdmc\" (UniqueName: \"kubernetes.io/projected/186b84da-4fba-40a2-8597-bda518c452d3-kube-api-access-wvdmc\") pod \"seaweedfs-86cc847c5c-zfqls\" (UID: \"186b84da-4fba-40a2-8597-bda518c452d3\") " pod="kserve/seaweedfs-86cc847c5c-zfqls" Apr 24 21:38:18.003854 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:18.003794 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-zfqls" Apr 24 21:38:18.130256 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:18.130229 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-zfqls"] Apr 24 21:38:18.132413 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:38:18.132386 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod186b84da_4fba_40a2_8597_bda518c452d3.slice/crio-24124bcd9aa2ef520d32b1f95da7cb5b0a1130fd28e0d800aab62d4ed8b01b78 WatchSource:0}: Error finding container 24124bcd9aa2ef520d32b1f95da7cb5b0a1130fd28e0d800aab62d4ed8b01b78: Status 404 returned error can't find the container with id 24124bcd9aa2ef520d32b1f95da7cb5b0a1130fd28e0d800aab62d4ed8b01b78 Apr 24 21:38:18.691395 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:18.691355 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-zfqls" event={"ID":"186b84da-4fba-40a2-8597-bda518c452d3","Type":"ContainerStarted","Data":"24124bcd9aa2ef520d32b1f95da7cb5b0a1130fd28e0d800aab62d4ed8b01b78"} Apr 24 21:38:20.700131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:20.700100 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-zfqls" event={"ID":"186b84da-4fba-40a2-8597-bda518c452d3","Type":"ContainerStarted","Data":"55ac949b041f2e326f51ef0ed0128fc8f40251b9dd6e47548068e2cf425c4d58"} Apr 24 21:38:20.700489 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:20.700215 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-zfqls" Apr 24 21:38:20.717306 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:20.717268 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-zfqls" podStartSLOduration=1.332346688 podStartE2EDuration="3.717235916s" 
podCreationTimestamp="2026-04-24 21:38:17 +0000 UTC" firstStartedPulling="2026-04-24 21:38:18.133730629 +0000 UTC m=+582.675682205" lastFinishedPulling="2026-04-24 21:38:20.518619842 +0000 UTC m=+585.060571433" observedRunningTime="2026-04-24 21:38:20.716510393 +0000 UTC m=+585.258461993" watchObservedRunningTime="2026-04-24 21:38:20.717235916 +0000 UTC m=+585.259187517" Apr 24 21:38:26.704932 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:26.704898 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-zfqls" Apr 24 21:38:28.594279 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.594244 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-556f786b98-9qsjb"] Apr 24 21:38:28.597608 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.597589 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.608879 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.608855 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-556f786b98-9qsjb"] Apr 24 21:38:28.679599 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.679570 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8ab2221-fd98-4f74-98bf-6a415c4994a6-console-config\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.679726 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.679604 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8ab2221-fd98-4f74-98bf-6a415c4994a6-trusted-ca-bundle\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.679726 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.679651 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8ab2221-fd98-4f74-98bf-6a415c4994a6-console-serving-cert\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.679726 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.679670 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8ab2221-fd98-4f74-98bf-6a415c4994a6-console-oauth-config\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.679883 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.679750 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8ab2221-fd98-4f74-98bf-6a415c4994a6-oauth-serving-cert\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.679883 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.679799 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d8ab2221-fd98-4f74-98bf-6a415c4994a6-service-ca\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.679883 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.679844 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztpvw\" (UniqueName: \"kubernetes.io/projected/d8ab2221-fd98-4f74-98bf-6a415c4994a6-kube-api-access-ztpvw\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.780316 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.780290 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8ab2221-fd98-4f74-98bf-6a415c4994a6-console-config\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.780428 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.780323 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8ab2221-fd98-4f74-98bf-6a415c4994a6-trusted-ca-bundle\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.780428 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.780370 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8ab2221-fd98-4f74-98bf-6a415c4994a6-console-serving-cert\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.780428 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.780393 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8ab2221-fd98-4f74-98bf-6a415c4994a6-console-oauth-config\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.780428 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.780409 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8ab2221-fd98-4f74-98bf-6a415c4994a6-oauth-serving-cert\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.780428 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.780428 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8ab2221-fd98-4f74-98bf-6a415c4994a6-service-ca\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.780692 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.780453 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztpvw\" (UniqueName: \"kubernetes.io/projected/d8ab2221-fd98-4f74-98bf-6a415c4994a6-kube-api-access-ztpvw\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 
21:38:28.781117 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.781089 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d8ab2221-fd98-4f74-98bf-6a415c4994a6-oauth-serving-cert\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.781117 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.781104 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d8ab2221-fd98-4f74-98bf-6a415c4994a6-service-ca\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.781263 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.781096 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d8ab2221-fd98-4f74-98bf-6a415c4994a6-console-config\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.781322 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.781291 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8ab2221-fd98-4f74-98bf-6a415c4994a6-trusted-ca-bundle\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.782927 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.782898 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d8ab2221-fd98-4f74-98bf-6a415c4994a6-console-serving-cert\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.783065 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.783046 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d8ab2221-fd98-4f74-98bf-6a415c4994a6-console-oauth-config\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.789803 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.789781 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztpvw\" (UniqueName: \"kubernetes.io/projected/d8ab2221-fd98-4f74-98bf-6a415c4994a6-kube-api-access-ztpvw\") pod \"console-556f786b98-9qsjb\" (UID: \"d8ab2221-fd98-4f74-98bf-6a415c4994a6\") " pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:28.907206 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:28.907170 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:29.029097 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:29.028929 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-556f786b98-9qsjb"] Apr 24 21:38:29.031761 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:38:29.031722 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8ab2221_fd98_4f74_98bf_6a415c4994a6.slice/crio-dc3beb3e34cc8f6f6320b4de073a962e5e4d34f3765ba74a7bb5ec2e8c69a7f8 WatchSource:0}: Error finding container dc3beb3e34cc8f6f6320b4de073a962e5e4d34f3765ba74a7bb5ec2e8c69a7f8: Status 404 returned error can't find the container with id dc3beb3e34cc8f6f6320b4de073a962e5e4d34f3765ba74a7bb5ec2e8c69a7f8 Apr 24 21:38:29.733861 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:29.733824 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556f786b98-9qsjb" event={"ID":"d8ab2221-fd98-4f74-98bf-6a415c4994a6","Type":"ContainerStarted","Data":"c4fdb5efdb3e957fbc9cdb77608af93034c65baab1d7510bdbfd53e44ebee4ad"} Apr 24 21:38:29.733861 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:29.733866 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556f786b98-9qsjb" event={"ID":"d8ab2221-fd98-4f74-98bf-6a415c4994a6","Type":"ContainerStarted","Data":"dc3beb3e34cc8f6f6320b4de073a962e5e4d34f3765ba74a7bb5ec2e8c69a7f8"} Apr 24 21:38:29.754184 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:29.754138 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-556f786b98-9qsjb" podStartSLOduration=1.7541256619999999 podStartE2EDuration="1.754125662s" podCreationTimestamp="2026-04-24 21:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:38:29.753009339 +0000 UTC m=+594.294960939" watchObservedRunningTime="2026-04-24 21:38:29.754125662 +0000 UTC m=+594.296077260" Apr 24 21:38:35.921417 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:35.921383 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:38:35.921417 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:35.921413 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:38:38.907581 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:38.907550 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:38.907581 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:38.907590 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:38.912236 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:38.912216 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:39.770981 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:39.770950 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-556f786b98-9qsjb" Apr 24 21:38:39.823667 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:38:39.823629 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-84994c589f-vhr2k"] Apr 24 21:39:04.843496 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:04.843436 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-84994c589f-vhr2k" podUID="bf272bc9-1606-4490-bcf9-fc46efe55151" containerName="console" containerID="cri-o://2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367" gracePeriod=15 Apr 24 21:39:05.079817 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.079797 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84994c589f-vhr2k_bf272bc9-1606-4490-bcf9-fc46efe55151/console/0.log" Apr 24 21:39:05.079922 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.079857 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:39:05.149869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.149847 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-service-ca\") pod \"bf272bc9-1606-4490-bcf9-fc46efe55151\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " Apr 24 21:39:05.150001 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.149879 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-trusted-ca-bundle\") pod \"bf272bc9-1606-4490-bcf9-fc46efe55151\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " Apr 24 21:39:05.150001 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.149908 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf272bc9-1606-4490-bcf9-fc46efe55151-console-serving-cert\") pod \"bf272bc9-1606-4490-bcf9-fc46efe55151\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " Apr 24 21:39:05.150001 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.149925 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-oauth-serving-cert\") pod \"bf272bc9-1606-4490-bcf9-fc46efe55151\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " Apr 24 21:39:05.150001 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.149950 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf272bc9-1606-4490-bcf9-fc46efe55151-console-oauth-config\") pod \"bf272bc9-1606-4490-bcf9-fc46efe55151\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " Apr 24 21:39:05.150207 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.150097 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thlmr\" (UniqueName: \"kubernetes.io/projected/bf272bc9-1606-4490-bcf9-fc46efe55151-kube-api-access-thlmr\") pod \"bf272bc9-1606-4490-bcf9-fc46efe55151\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " Apr 24 21:39:05.150207 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.150160 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-console-config\") pod \"bf272bc9-1606-4490-bcf9-fc46efe55151\" (UID: \"bf272bc9-1606-4490-bcf9-fc46efe55151\") " Apr 24 
21:39:05.150360 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.150252 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-service-ca" (OuterVolumeSpecName: "service-ca") pod "bf272bc9-1606-4490-bcf9-fc46efe55151" (UID: "bf272bc9-1606-4490-bcf9-fc46efe55151"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:05.150423 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.150357 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bf272bc9-1606-4490-bcf9-fc46efe55151" (UID: "bf272bc9-1606-4490-bcf9-fc46efe55151"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:05.150423 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.150378 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bf272bc9-1606-4490-bcf9-fc46efe55151" (UID: "bf272bc9-1606-4490-bcf9-fc46efe55151"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:05.150551 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.150469 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-oauth-serving-cert\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:39:05.150551 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.150486 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-service-ca\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:39:05.150551 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.150499 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-trusted-ca-bundle\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:39:05.150715 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.150678 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-console-config" (OuterVolumeSpecName: "console-config") pod "bf272bc9-1606-4490-bcf9-fc46efe55151" (UID: "bf272bc9-1606-4490-bcf9-fc46efe55151"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:39:05.151989 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.151968 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf272bc9-1606-4490-bcf9-fc46efe55151-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bf272bc9-1606-4490-bcf9-fc46efe55151" (UID: "bf272bc9-1606-4490-bcf9-fc46efe55151"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:05.152229 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.152212 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf272bc9-1606-4490-bcf9-fc46efe55151-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bf272bc9-1606-4490-bcf9-fc46efe55151" (UID: "bf272bc9-1606-4490-bcf9-fc46efe55151"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:39:05.152284 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.152248 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf272bc9-1606-4490-bcf9-fc46efe55151-kube-api-access-thlmr" (OuterVolumeSpecName: "kube-api-access-thlmr") pod "bf272bc9-1606-4490-bcf9-fc46efe55151" (UID: "bf272bc9-1606-4490-bcf9-fc46efe55151"). InnerVolumeSpecName "kube-api-access-thlmr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:39:05.251814 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.251793 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf272bc9-1606-4490-bcf9-fc46efe55151-console-serving-cert\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:39:05.251904 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.251816 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf272bc9-1606-4490-bcf9-fc46efe55151-console-oauth-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:39:05.251904 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.251826 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-thlmr\" (UniqueName: \"kubernetes.io/projected/bf272bc9-1606-4490-bcf9-fc46efe55151-kube-api-access-thlmr\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:39:05.251904 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.251836 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf272bc9-1606-4490-bcf9-fc46efe55151-console-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:39:05.852134 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.852108 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84994c589f-vhr2k_bf272bc9-1606-4490-bcf9-fc46efe55151/console/0.log" Apr 24 21:39:05.852648 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.852147 2567 generic.go:358] "Generic (PLEG): container finished" podID="bf272bc9-1606-4490-bcf9-fc46efe55151" containerID="2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367" exitCode=2 Apr 24 21:39:05.852648 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.852222 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84994c589f-vhr2k" Apr 24 21:39:05.852648 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.852235 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84994c589f-vhr2k" event={"ID":"bf272bc9-1606-4490-bcf9-fc46efe55151","Type":"ContainerDied","Data":"2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367"} Apr 24 21:39:05.852648 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.852283 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84994c589f-vhr2k" event={"ID":"bf272bc9-1606-4490-bcf9-fc46efe55151","Type":"ContainerDied","Data":"ef606e77d1df8a5708288ebd22548751326872e2deb954884b483f67d32543f3"} Apr 24 21:39:05.852648 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.852298 2567 scope.go:117] "RemoveContainer" containerID="2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367" Apr 24 21:39:05.861075 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.861058 2567 scope.go:117] "RemoveContainer" containerID="2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367" Apr 24 21:39:05.861308 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:39:05.861291 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367\": container with ID starting with 2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367 not found: ID does not exist" containerID="2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367" Apr 24 21:39:05.861375 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.861314 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367"} err="failed to get container status \"2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367\": rpc error: code = NotFound desc = could not find container \"2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367\": container with ID starting with 2fa3d9cfc55045b0405b222e9441580ff9020435e7603c9209abd912409d9367 not found: ID does not exist" Apr 24 21:39:05.878870 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.878064 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84994c589f-vhr2k"] Apr 24 21:39:05.879943 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.879925 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-84994c589f-vhr2k"] Apr 24 21:39:05.982253 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:05.982224 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf272bc9-1606-4490-bcf9-fc46efe55151" path="/var/lib/kubelet/pods/bf272bc9-1606-4490-bcf9-fc46efe55151/volumes" Apr 24 21:39:26.162752 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.162717 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-n5pfz"] Apr 24 21:39:26.163100 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.163044 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf272bc9-1606-4490-bcf9-fc46efe55151" containerName="console" Apr 24 21:39:26.163100 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.163055 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf272bc9-1606-4490-bcf9-fc46efe55151" containerName="console" Apr 24 21:39:26.163175 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:39:26.163116 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf272bc9-1606-4490-bcf9-fc46efe55151" containerName="console" Apr 24 21:39:26.165926 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.165910 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-n5pfz" Apr 24 21:39:26.169265 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.169240 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-26png\"" Apr 24 21:39:26.169665 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.169636 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 24 21:39:26.175916 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.175892 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-n5pfz"] Apr 24 21:39:26.207692 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.207664 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92rm4\" (UniqueName: \"kubernetes.io/projected/9256e237-2f00-4a32-9878-6afad79c86d5-kube-api-access-92rm4\") pod \"odh-model-controller-696fc77849-n5pfz\" (UID: \"9256e237-2f00-4a32-9878-6afad79c86d5\") " pod="kserve/odh-model-controller-696fc77849-n5pfz" Apr 24 21:39:26.207823 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.207742 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9256e237-2f00-4a32-9878-6afad79c86d5-cert\") pod \"odh-model-controller-696fc77849-n5pfz\" (UID: \"9256e237-2f00-4a32-9878-6afad79c86d5\") " pod="kserve/odh-model-controller-696fc77849-n5pfz" Apr 24 21:39:26.308251 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.308220 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9256e237-2f00-4a32-9878-6afad79c86d5-cert\") pod \"odh-model-controller-696fc77849-n5pfz\" (UID: \"9256e237-2f00-4a32-9878-6afad79c86d5\") " pod="kserve/odh-model-controller-696fc77849-n5pfz" Apr 24 21:39:26.308364 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.308264 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92rm4\" (UniqueName: \"kubernetes.io/projected/9256e237-2f00-4a32-9878-6afad79c86d5-kube-api-access-92rm4\") pod \"odh-model-controller-696fc77849-n5pfz\" (UID: \"9256e237-2f00-4a32-9878-6afad79c86d5\") " pod="kserve/odh-model-controller-696fc77849-n5pfz" Apr 24 21:39:26.310724 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.310699 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9256e237-2f00-4a32-9878-6afad79c86d5-cert\") pod \"odh-model-controller-696fc77849-n5pfz\" (UID: \"9256e237-2f00-4a32-9878-6afad79c86d5\") " pod="kserve/odh-model-controller-696fc77849-n5pfz" Apr 24 21:39:26.317945 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.317921 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92rm4\" (UniqueName: \"kubernetes.io/projected/9256e237-2f00-4a32-9878-6afad79c86d5-kube-api-access-92rm4\") pod \"odh-model-controller-696fc77849-n5pfz\" (UID: \"9256e237-2f00-4a32-9878-6afad79c86d5\") " pod="kserve/odh-model-controller-696fc77849-n5pfz" Apr 24 21:39:26.476581 
ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.476508 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-n5pfz" Apr 24 21:39:26.593183 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.593159 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-n5pfz"] Apr 24 21:39:26.595594 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:39:26.595559 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9256e237_2f00_4a32_9878_6afad79c86d5.slice/crio-96ef26c7441584934f38635e5d0ab9beab3da3db6fbbb0627a45f4306739fcd4 WatchSource:0}: Error finding container 96ef26c7441584934f38635e5d0ab9beab3da3db6fbbb0627a45f4306739fcd4: Status 404 returned error can't find the container with id 96ef26c7441584934f38635e5d0ab9beab3da3db6fbbb0627a45f4306739fcd4 Apr 24 21:39:26.920946 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:26.920914 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-n5pfz" event={"ID":"9256e237-2f00-4a32-9878-6afad79c86d5","Type":"ContainerStarted","Data":"96ef26c7441584934f38635e5d0ab9beab3da3db6fbbb0627a45f4306739fcd4"} Apr 24 21:39:29.948978 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:29.948938 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-n5pfz" event={"ID":"9256e237-2f00-4a32-9878-6afad79c86d5","Type":"ContainerStarted","Data":"8a17742cddec7715cc63d9e9af644ea6e71c31b727f62171afebb0a8b91553a4"} Apr 24 21:39:29.949359 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:29.949069 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-n5pfz" Apr 24 21:39:29.966976 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:29.966928 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-n5pfz" podStartSLOduration=1.50191422 podStartE2EDuration="3.966914419s" podCreationTimestamp="2026-04-24 21:39:26 +0000 UTC" firstStartedPulling="2026-04-24 21:39:26.596745145 +0000 UTC m=+651.138696721" lastFinishedPulling="2026-04-24 21:39:29.06174534 +0000 UTC m=+653.603696920" observedRunningTime="2026-04-24 21:39:29.965827906 +0000 UTC m=+654.507779503" watchObservedRunningTime="2026-04-24 21:39:29.966914419 +0000 UTC m=+654.508866243" Apr 24 21:39:40.954972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:39:40.954945 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-n5pfz" Apr 24 21:40:00.964749 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:00.964714 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg"] Apr 24 21:40:00.968304 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:00.968287 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:00.970825 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:00.970796 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-ae801-predictor-serving-cert\"" Apr 24 21:40:00.970956 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:00.970850 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:40:00.970956 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:00.970871 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config\"" Apr 24 21:40:00.970956 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:00.970875 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:40:00.971098 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:00.970961 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-mq7qd\"" Apr 24 21:40:00.979596 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:00.979574 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg"] Apr 24 21:40:01.071730 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.071703 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:01.071889 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.071750 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-proxy-tls\") pod \"isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:01.071889 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.071804 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q29f\" (UniqueName: \"kubernetes.io/projected/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-kube-api-access-7q29f\") pod \"isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:01.071889 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.071825 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 
21:40:01.173148 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.173116 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:01.173317 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.173165 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-proxy-tls\") pod \"isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:01.173317 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.173220 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q29f\" (UniqueName: \"kubernetes.io/projected/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-kube-api-access-7q29f\") pod \"isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:01.173317 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.173251 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:01.173577 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.173554 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:01.173884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.173862 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:01.175636 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.175617 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-proxy-tls\") pod \"isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:01.190318 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:40:01.190295 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q29f\" (UniqueName: \"kubernetes.io/projected/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-kube-api-access-7q29f\") pod \"isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:01.279192 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.279136 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:01.407132 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.407077 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg"] Apr 24 21:40:01.409265 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:40:01.409237 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9e83759_d6d0_4ac8_8fd5_07b71bd7a32e.slice/crio-13ff0822a7dab3a1c896e5c010ab5a759e12010178b4f066fe1f48293eb8db2f WatchSource:0}: Error finding container 13ff0822a7dab3a1c896e5c010ab5a759e12010178b4f066fe1f48293eb8db2f: Status 404 returned error can't find the container with id 13ff0822a7dab3a1c896e5c010ab5a759e12010178b4f066fe1f48293eb8db2f Apr 24 21:40:01.411158 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:01.411142 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:40:02.060702 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:02.060666 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" event={"ID":"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e","Type":"ContainerStarted","Data":"13ff0822a7dab3a1c896e5c010ab5a759e12010178b4f066fe1f48293eb8db2f"} Apr 24 21:40:05.073301 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:05.073263 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" event={"ID":"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e","Type":"ContainerStarted","Data":"cd5c0a0a9aae327a7a88f3723e9ca9961b06860566211cc16d4963c26b99e232"} Apr 24 21:40:09.086602 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:09.086565 2567 generic.go:358] "Generic (PLEG): container finished" podID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerID="cd5c0a0a9aae327a7a88f3723e9ca9961b06860566211cc16d4963c26b99e232" exitCode=0 Apr 24 21:40:09.087004 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:09.086641 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" event={"ID":"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e","Type":"ContainerDied","Data":"cd5c0a0a9aae327a7a88f3723e9ca9961b06860566211cc16d4963c26b99e232"} Apr 24 21:40:23.146145 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:23.146110 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" event={"ID":"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e","Type":"ContainerStarted","Data":"3e4df262628b11e18e37265098a9f7926cb9f2214672a3cef73fd353b7d51679"} Apr 24 21:40:25.157469 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:25.157429 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" event={"ID":"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e","Type":"ContainerStarted","Data":"841dc1b9f23cd0329abd63c04b915e1a3974579fae9052949e30dfc3bf40afa5"} Apr 24 21:40:28.169100 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:28.169059 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" event={"ID":"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e","Type":"ContainerStarted","Data":"481487a80ee78019fe750b9c7e9208792265779dd2045155520583aa9b174f9c"} Apr 24 21:40:28.169460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:28.169185 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:28.200747 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:28.200687 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podStartSLOduration=2.072785252 podStartE2EDuration="28.200673139s" podCreationTimestamp="2026-04-24 21:40:00 +0000 UTC" firstStartedPulling="2026-04-24 21:40:01.411263766 +0000 UTC m=+685.953215342" lastFinishedPulling="2026-04-24 21:40:27.53915165 +0000 UTC m=+712.081103229" observedRunningTime="2026-04-24 21:40:28.198507364 +0000 UTC m=+712.740458960" watchObservedRunningTime="2026-04-24 21:40:28.200673139 +0000 UTC m=+712.742624805" Apr 24 21:40:29.172505 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:29.172463 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:29.172505 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:29.172506 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:29.173862 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:29.173808 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 24 21:40:29.174464 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:29.174439 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:29.177205 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:29.177187 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:40:30.175404 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:30.175366 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 24 21:40:30.175857 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:30.175653 2567 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:31.178707 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:31.178665 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 24 21:40:31.179072 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:31.178918 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:41.178752 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:41.178711 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 24 21:40:41.179236 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:41.179147 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:40:51.179543 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:51.179479 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 24 21:40:51.180019 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:40:51.179998 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:01.178641 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:01.178594 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 24 21:41:01.179130 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:01.179108 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:11.178790 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:11.178746 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.34:8080: connect: connection refused" Apr 24 21:41:11.179236 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:11.179148 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:21.178685 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:21.178646 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 24 21:41:21.179192 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:21.179169 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:31.179629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:31.179601 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:41:31.180011 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:31.179662 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:41:45.966024 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:45.965991 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg"] Apr 24 21:41:45.966578 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:45.966374 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" containerID="cri-o://3e4df262628b11e18e37265098a9f7926cb9f2214672a3cef73fd353b7d51679" gracePeriod=30 Apr 24 21:41:45.966578 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:45.966512 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" containerID="cri-o://481487a80ee78019fe750b9c7e9208792265779dd2045155520583aa9b174f9c" gracePeriod=30 Apr 24 21:41:45.966721 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:45.966608 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kube-rbac-proxy" containerID="cri-o://841dc1b9f23cd0329abd63c04b915e1a3974579fae9052949e30dfc3bf40afa5" gracePeriod=30 Apr 24 21:41:46.068573 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.068542 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6"] Apr 24 21:41:46.072619 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.072596 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.074625 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.074600 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-62689-predictor-serving-cert\"" Apr 24 21:41:46.075558 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.074864 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config\"" Apr 24 21:41:46.083580 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.083554 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6"] Apr 24 21:41:46.135871 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.135849 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx"] Apr 24 21:41:46.139310 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.139294 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.141394 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.141377 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config\"" Apr 24 21:41:46.141474 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.141400 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-62689-predictor-serving-cert\"" Apr 24 21:41:46.150036 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.150015 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx"] Apr 24 21:41:46.207187 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.207160 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/269a8e34-b07b-4ce4-92b2-f57660f9df87-isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.207300 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.207199 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7dn\" (UniqueName: \"kubernetes.io/projected/f7636b02-9417-4eed-90f5-a4d603191037-kube-api-access-vl7dn\") pod \"isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.207300 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.207230 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/269a8e34-b07b-4ce4-92b2-f57660f9df87-proxy-tls\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.207300 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:41:46.207284 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7636b02-9417-4eed-90f5-a4d603191037-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.207412 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.207337 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7636b02-9417-4eed-90f5-a4d603191037-proxy-tls\") pod \"isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.207412 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.207373 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f7636b02-9417-4eed-90f5-a4d603191037-isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.207412 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.207397 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f788\" (UniqueName: \"kubernetes.io/projected/269a8e34-b07b-4ce4-92b2-f57660f9df87-kube-api-access-7f788\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.207506 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.207415 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/269a8e34-b07b-4ce4-92b2-f57660f9df87-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.308615 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.308549 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f7636b02-9417-4eed-90f5-a4d603191037-isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.308615 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.308588 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7f788\" (UniqueName: \"kubernetes.io/projected/269a8e34-b07b-4ce4-92b2-f57660f9df87-kube-api-access-7f788\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.308615 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.308612 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/269a8e34-b07b-4ce4-92b2-f57660f9df87-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.308836 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.308666 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/269a8e34-b07b-4ce4-92b2-f57660f9df87-isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.308836 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.308707 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7dn\" (UniqueName: \"kubernetes.io/projected/f7636b02-9417-4eed-90f5-a4d603191037-kube-api-access-vl7dn\") pod \"isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.308836 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.308730 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/269a8e34-b07b-4ce4-92b2-f57660f9df87-proxy-tls\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.308836 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.308757 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7636b02-9417-4eed-90f5-a4d603191037-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.309044 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:41:46.308908 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-serving-cert: secret "isvc-xgboost-graph-raw-62689-predictor-serving-cert" not found Apr 24 21:41:46.309044 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.308972 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7636b02-9417-4eed-90f5-a4d603191037-proxy-tls\") pod \"isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.309044 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:41:46.308992 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/269a8e34-b07b-4ce4-92b2-f57660f9df87-proxy-tls podName:269a8e34-b07b-4ce4-92b2-f57660f9df87 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:41:46.808970976 +0000 UTC m=+791.350922566 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/269a8e34-b07b-4ce4-92b2-f57660f9df87-proxy-tls") pod "isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" (UID: "269a8e34-b07b-4ce4-92b2-f57660f9df87") : secret "isvc-xgboost-graph-raw-62689-predictor-serving-cert" not found Apr 24 21:41:46.309044 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.309014 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/269a8e34-b07b-4ce4-92b2-f57660f9df87-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.309249 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.309094 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7636b02-9417-4eed-90f5-a4d603191037-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.309338 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.309318 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f7636b02-9417-4eed-90f5-a4d603191037-isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.309647 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.309626 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/269a8e34-b07b-4ce4-92b2-f57660f9df87-isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.311324 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.311305 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7636b02-9417-4eed-90f5-a4d603191037-proxy-tls\") pod \"isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.318267 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.318246 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f788\" (UniqueName: \"kubernetes.io/projected/269a8e34-b07b-4ce4-92b2-f57660f9df87-kube-api-access-7f788\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.318635 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.318605 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-vl7dn\" (UniqueName: \"kubernetes.io/projected/f7636b02-9417-4eed-90f5-a4d603191037-kube-api-access-vl7dn\") pod \"isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.387104 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.387085 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:46.441857 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.441810 2567 generic.go:358] "Generic (PLEG): container finished" podID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerID="841dc1b9f23cd0329abd63c04b915e1a3974579fae9052949e30dfc3bf40afa5" exitCode=2 Apr 24 21:41:46.442007 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.441853 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" event={"ID":"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e","Type":"ContainerDied","Data":"841dc1b9f23cd0329abd63c04b915e1a3974579fae9052949e30dfc3bf40afa5"} Apr 24 21:41:46.717248 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.717221 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6"] Apr 24 21:41:46.719908 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:41:46.719880 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7636b02_9417_4eed_90f5_a4d603191037.slice/crio-275d425bb6eed8d2c558fe5668e0853a099907cea7c3df64e02ebf7619bb37f5 WatchSource:0}: Error finding container 275d425bb6eed8d2c558fe5668e0853a099907cea7c3df64e02ebf7619bb37f5: Status 404 returned error can't find the container with id 275d425bb6eed8d2c558fe5668e0853a099907cea7c3df64e02ebf7619bb37f5 Apr 24 21:41:46.813970 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:46.813942 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/269a8e34-b07b-4ce4-92b2-f57660f9df87-proxy-tls\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:46.814137 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:41:46.814115 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-serving-cert: secret "isvc-xgboost-graph-raw-62689-predictor-serving-cert" not found Apr 24 21:41:46.814202 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:41:46.814190 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/269a8e34-b07b-4ce4-92b2-f57660f9df87-proxy-tls podName:269a8e34-b07b-4ce4-92b2-f57660f9df87 nodeName:}" failed. No retries permitted until 2026-04-24 21:41:47.814169346 +0000 UTC m=+792.356120928 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/269a8e34-b07b-4ce4-92b2-f57660f9df87-proxy-tls") pod "isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" (UID: "269a8e34-b07b-4ce4-92b2-f57660f9df87") : secret "isvc-xgboost-graph-raw-62689-predictor-serving-cert" not found Apr 24 21:41:47.445876 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:47.445841 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" event={"ID":"f7636b02-9417-4eed-90f5-a4d603191037","Type":"ContainerStarted","Data":"04da4c6c2a7696b96148549977cdbb509340b6964f69dd029c1a4b36079cdb7c"} Apr 24 21:41:47.445876 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:47.445877 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" event={"ID":"f7636b02-9417-4eed-90f5-a4d603191037","Type":"ContainerStarted","Data":"275d425bb6eed8d2c558fe5668e0853a099907cea7c3df64e02ebf7619bb37f5"} Apr 24 21:41:47.823289 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:47.823201 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/269a8e34-b07b-4ce4-92b2-f57660f9df87-proxy-tls\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:47.825963 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:47.825883 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/269a8e34-b07b-4ce4-92b2-f57660f9df87-proxy-tls\") pod \"isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:47.949269 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:47.949243 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:41:48.073091 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:48.073067 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx"] Apr 24 21:41:48.074286 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:41:48.074226 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod269a8e34_b07b_4ce4_92b2_f57660f9df87.slice/crio-1dbefc5a3c2c81cb5c9e72a5ce0b8018d2adf66fb6d2aa1e720fb22f92844ba3 WatchSource:0}: Error finding container 1dbefc5a3c2c81cb5c9e72a5ce0b8018d2adf66fb6d2aa1e720fb22f92844ba3: Status 404 returned error can't find the container with id 1dbefc5a3c2c81cb5c9e72a5ce0b8018d2adf66fb6d2aa1e720fb22f92844ba3 Apr 24 21:41:48.451154 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:48.451110 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" event={"ID":"269a8e34-b07b-4ce4-92b2-f57660f9df87","Type":"ContainerStarted","Data":"249ec86eb7dc29af7bc3dc0b605fc47bc5593bb73701b4b7f2b9ae97b64d4680"} Apr 24 21:41:48.451507 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:48.451165 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" event={"ID":"269a8e34-b07b-4ce4-92b2-f57660f9df87","Type":"ContainerStarted","Data":"1dbefc5a3c2c81cb5c9e72a5ce0b8018d2adf66fb6d2aa1e720fb22f92844ba3"} Apr 24 21:41:49.172746 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:49.172707 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 24 21:41:50.459008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:50.458979 2567 generic.go:358] "Generic (PLEG): container finished" podID="f7636b02-9417-4eed-90f5-a4d603191037" containerID="04da4c6c2a7696b96148549977cdbb509340b6964f69dd029c1a4b36079cdb7c" exitCode=0 Apr 24 21:41:50.459331 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:50.459048 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" event={"ID":"f7636b02-9417-4eed-90f5-a4d603191037","Type":"ContainerDied","Data":"04da4c6c2a7696b96148549977cdbb509340b6964f69dd029c1a4b36079cdb7c"} Apr 24 21:41:50.461294 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:50.461235 2567 generic.go:358] "Generic (PLEG): container finished" podID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerID="3e4df262628b11e18e37265098a9f7926cb9f2214672a3cef73fd353b7d51679" exitCode=0 Apr 24 21:41:50.461433 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:50.461306 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" event={"ID":"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e","Type":"ContainerDied","Data":"3e4df262628b11e18e37265098a9f7926cb9f2214672a3cef73fd353b7d51679"} Apr 24 21:41:51.178912 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:51.178869 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" 
podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 24 21:41:51.179234 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:51.179210 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:41:51.466750 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:51.466683 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" event={"ID":"f7636b02-9417-4eed-90f5-a4d603191037","Type":"ContainerStarted","Data":"dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54"} Apr 24 21:41:51.466750 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:51.466719 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" event={"ID":"f7636b02-9417-4eed-90f5-a4d603191037","Type":"ContainerStarted","Data":"253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40"} Apr 24 21:41:51.467131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:51.466914 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:51.485512 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:51.485465 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podStartSLOduration=5.485450462 podStartE2EDuration="5.485450462s" podCreationTimestamp="2026-04-24 21:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:41:51.484437127 +0000 UTC m=+796.026388739" watchObservedRunningTime="2026-04-24 21:41:51.485450462 +0000 UTC m=+796.027402060" Apr 24 21:41:52.471364 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:52.471330 2567 generic.go:358] "Generic (PLEG): container finished" podID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerID="249ec86eb7dc29af7bc3dc0b605fc47bc5593bb73701b4b7f2b9ae97b64d4680" exitCode=0 Apr 24 21:41:52.471942 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:52.471398 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" event={"ID":"269a8e34-b07b-4ce4-92b2-f57660f9df87","Type":"ContainerDied","Data":"249ec86eb7dc29af7bc3dc0b605fc47bc5593bb73701b4b7f2b9ae97b64d4680"} Apr 24 21:41:52.471942 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:52.471931 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:52.473403 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:52.473355 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 21:41:53.476154 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:53.475855 2567 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 21:41:54.173446 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:54.173378 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 24 21:41:58.480955 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:58.480925 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:41:58.481715 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:58.481683 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 21:41:59.172998 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:59.172958 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 24 21:41:59.173181 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:41:59.173108 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:42:01.178862 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:01.178700 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 24 21:42:01.179297 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:01.179096 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:42:04.173225 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:04.173180 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 24 21:42:08.482116 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:08.482076 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 21:42:09.173131 
ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:09.173091 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 24 21:42:10.542535 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:10.542489 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" event={"ID":"269a8e34-b07b-4ce4-92b2-f57660f9df87","Type":"ContainerStarted","Data":"15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff"} Apr 24 21:42:10.542947 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:10.542545 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" event={"ID":"269a8e34-b07b-4ce4-92b2-f57660f9df87","Type":"ContainerStarted","Data":"08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb"} Apr 24 21:42:10.542947 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:10.542848 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:42:10.543092 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:10.542980 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:42:10.544294 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:10.544267 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 21:42:10.564126 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:10.564083 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podStartSLOduration=6.982819743 podStartE2EDuration="24.564071724s" podCreationTimestamp="2026-04-24 21:41:46 +0000 UTC" firstStartedPulling="2026-04-24 21:41:52.472920464 +0000 UTC m=+797.014872042" lastFinishedPulling="2026-04-24 21:42:10.054172443 +0000 UTC m=+814.596124023" observedRunningTime="2026-04-24 21:42:10.561664727 +0000 UTC m=+815.103616325" watchObservedRunningTime="2026-04-24 21:42:10.564071724 +0000 UTC m=+815.106023323" Apr 24 21:42:11.178790 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:11.178746 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 24 21:42:11.178989 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:11.178886 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:42:11.179087 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:11.179063 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" 
podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:42:11.179175 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:11.179164 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:42:11.546235 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:11.546146 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 21:42:12.549759 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:12.549723 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 21:42:14.173454 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:14.173416 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 24 21:42:16.565388 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.565341 2567 generic.go:358] "Generic (PLEG): container finished" podID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerID="481487a80ee78019fe750b9c7e9208792265779dd2045155520583aa9b174f9c" exitCode=0 Apr 24 21:42:16.565847 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.565412 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" event={"ID":"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e","Type":"ContainerDied","Data":"481487a80ee78019fe750b9c7e9208792265779dd2045155520583aa9b174f9c"} Apr 24 21:42:16.629015 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.628991 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:42:16.678234 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.678202 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config\") pod \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " Apr 24 21:42:16.678376 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.678246 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-proxy-tls\") pod \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " Apr 24 21:42:16.678376 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.678341 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-kserve-provision-location\") pod \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " Apr 24 21:42:16.678502 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.678390 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q29f\" (UniqueName: \"kubernetes.io/projected/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-kube-api-access-7q29f\") pod \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\" (UID: \"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e\") " Apr 24 21:42:16.678619 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.678586 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config") pod "b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" (UID: "b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:42:16.678677 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.678658 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" (UID: "b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:16.680433 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.680407 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-kube-api-access-7q29f" (OuterVolumeSpecName: "kube-api-access-7q29f") pod "b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" (UID: "b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e"). InnerVolumeSpecName "kube-api-access-7q29f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:42:16.680553 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.680442 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" (UID: "b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:42:16.779343 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.779312 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7q29f\" (UniqueName: \"kubernetes.io/projected/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-kube-api-access-7q29f\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:42:16.779343 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.779340 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-isvc-raw-sklearn-batcher-ae801-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:42:16.779515 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.779351 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:42:16.779515 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:16.779360 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e-kserve-provision-location\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:42:17.553896 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:17.553862 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:42:17.554438 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:17.554411 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 21:42:17.571145 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:17.571121 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" Apr 24 21:42:17.571534 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:17.571120 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg" event={"ID":"b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e","Type":"ContainerDied","Data":"13ff0822a7dab3a1c896e5c010ab5a759e12010178b4f066fe1f48293eb8db2f"} Apr 24 21:42:17.571534 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:17.571229 2567 scope.go:117] "RemoveContainer" containerID="481487a80ee78019fe750b9c7e9208792265779dd2045155520583aa9b174f9c" Apr 24 21:42:17.582002 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:17.581769 2567 scope.go:117] "RemoveContainer" containerID="841dc1b9f23cd0329abd63c04b915e1a3974579fae9052949e30dfc3bf40afa5" Apr 24 21:42:17.589955 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:17.589938 2567 scope.go:117] "RemoveContainer" containerID="3e4df262628b11e18e37265098a9f7926cb9f2214672a3cef73fd353b7d51679" Apr 24 21:42:17.596357 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:17.596336 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg"] Apr 24 21:42:17.597079 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:17.597067 2567 scope.go:117] "RemoveContainer" containerID="cd5c0a0a9aae327a7a88f3723e9ca9961b06860566211cc16d4963c26b99e232" Apr 24 21:42:17.603618 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:17.603596 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-ae801-predictor-74c8899fd8-h6drg"] Apr 24 21:42:17.982603 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:17.982570 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" path="/var/lib/kubelet/pods/b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e/volumes" Apr 24 21:42:18.482191 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:18.482151 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 21:42:27.555076 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:27.555032 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 21:42:28.481942 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:28.481903 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 21:42:37.554972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:37.554936 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 21:42:38.482087 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:42:38.482045 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 21:42:47.554378 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:47.554335 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 21:42:48.482010 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:48.481971 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 21:42:57.554803 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:57.554756 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 21:42:58.482394 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:42:58.482368 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:43:07.555479 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:07.555449 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:43:26.287794 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.287721 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6"] Apr 24 21:43:26.288240 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.288054 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" containerID="cri-o://253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40" gracePeriod=30 Apr 24 21:43:26.288240 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.288116 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kube-rbac-proxy" containerID="cri-o://dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54" gracePeriod=30 Apr 24 21:43:26.350330 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.350306 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94"] Apr 24 21:43:26.350687 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.350673 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kube-rbac-proxy" Apr 24 21:43:26.350750 ip-10-0-134-249 kubenswrapper[2567]: I0424 
21:43:26.350690 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kube-rbac-proxy" Apr 24 21:43:26.350750 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.350706 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" Apr 24 21:43:26.350750 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.350711 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" Apr 24 21:43:26.350750 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.350720 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="storage-initializer" Apr 24 21:43:26.350750 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.350726 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="storage-initializer" Apr 24 21:43:26.350750 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.350744 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" Apr 24 21:43:26.350750 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.350749 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" Apr 24 21:43:26.351029 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.350804 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kube-rbac-proxy" Apr 24 21:43:26.351029 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.350816 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="kserve-container" Apr 24 21:43:26.351029 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.350824 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9e83759-d6d0-4ac8-8fd5-07b71bd7a32e" containerName="agent" Apr 24 21:43:26.354260 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.354241 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:26.356709 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.356686 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\"" Apr 24 21:43:26.356807 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.356772 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-serving-cert\"" Apr 24 21:43:26.366467 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.366445 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94"] Apr 24 21:43:26.423778 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.423753 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85ckx\" (UniqueName: \"kubernetes.io/projected/0c696334-2346-49ce-8d0b-8991eb2436e4-kube-api-access-85ckx\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:26.423886 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.423807 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c696334-2346-49ce-8d0b-8991eb2436e4-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:26.423886 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.423842 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c696334-2346-49ce-8d0b-8991eb2436e4-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:26.423886 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.423861 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c696334-2346-49ce-8d0b-8991eb2436e4-isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:26.451314 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.451273 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx"] Apr 24 21:43:26.451656 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.451631 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kube-rbac-proxy" containerID="cri-o://15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff" 
gracePeriod=30 Apr 24 21:43:26.451765 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.451627 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" containerID="cri-o://08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb" gracePeriod=30 Apr 24 21:43:26.524884 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.524856 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c696334-2346-49ce-8d0b-8991eb2436e4-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:26.524997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.524915 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c696334-2346-49ce-8d0b-8991eb2436e4-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:26.524997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.524935 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c696334-2346-49ce-8d0b-8991eb2436e4-isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:26.524997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.524993 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85ckx\" (UniqueName: \"kubernetes.io/projected/0c696334-2346-49ce-8d0b-8991eb2436e4-kube-api-access-85ckx\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:26.525176 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:43:26.525007 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-serving-cert: secret "isvc-sklearn-graph-raw-hpa-2b4f4-predictor-serving-cert" not found Apr 24 21:43:26.525176 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:43:26.525075 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c696334-2346-49ce-8d0b-8991eb2436e4-proxy-tls podName:0c696334-2346-49ce-8d0b-8991eb2436e4 nodeName:}" failed. No retries permitted until 2026-04-24 21:43:27.025054547 +0000 UTC m=+891.567006131 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0c696334-2346-49ce-8d0b-8991eb2436e4-proxy-tls") pod "isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" (UID: "0c696334-2346-49ce-8d0b-8991eb2436e4") : secret "isvc-sklearn-graph-raw-hpa-2b4f4-predictor-serving-cert" not found Apr 24 21:43:26.525316 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.525297 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c696334-2346-49ce-8d0b-8991eb2436e4-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:26.525629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.525611 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c696334-2346-49ce-8d0b-8991eb2436e4-isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:26.534395 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.534370 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85ckx\" (UniqueName: \"kubernetes.io/projected/0c696334-2346-49ce-8d0b-8991eb2436e4-kube-api-access-85ckx\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:26.640135 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.640103 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz"] Apr 24 21:43:26.643656 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.643637 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.645841 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.645813 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\"" Apr 24 21:43:26.645968 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.645872 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-serving-cert\"" Apr 24 21:43:26.654192 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.654170 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz"] Apr 24 21:43:26.726868 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.726846 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4560568f-042c-47ee-838e-0ade81f72494-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.727007 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.726879 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4560568f-042c-47ee-838e-0ade81f72494-isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.727007 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.726929 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4560568f-042c-47ee-838e-0ade81f72494-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.727132 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.727019 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8xbs\" (UniqueName: \"kubernetes.io/projected/4560568f-042c-47ee-838e-0ade81f72494-kube-api-access-c8xbs\") pod \"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.808092 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.808065 2567 generic.go:358] "Generic (PLEG): container finished" podID="f7636b02-9417-4eed-90f5-a4d603191037" containerID="dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54" exitCode=2 Apr 24 21:43:26.808223 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.808134 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" 
event={"ID":"f7636b02-9417-4eed-90f5-a4d603191037","Type":"ContainerDied","Data":"dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54"} Apr 24 21:43:26.810205 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.810183 2567 generic.go:358] "Generic (PLEG): container finished" podID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerID="15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff" exitCode=2 Apr 24 21:43:26.810298 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.810216 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" event={"ID":"269a8e34-b07b-4ce4-92b2-f57660f9df87","Type":"ContainerDied","Data":"15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff"} Apr 24 21:43:26.827781 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.827706 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4560568f-042c-47ee-838e-0ade81f72494-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.827781 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.827752 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4560568f-042c-47ee-838e-0ade81f72494-isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.827959 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.827820 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4560568f-042c-47ee-838e-0ade81f72494-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.827959 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.827866 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8xbs\" (UniqueName: \"kubernetes.io/projected/4560568f-042c-47ee-838e-0ade81f72494-kube-api-access-c8xbs\") pod \"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.828443 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.828420 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4560568f-042c-47ee-838e-0ade81f72494-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.828814 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.828792 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/4560568f-042c-47ee-838e-0ade81f72494-isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.830482 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.830462 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4560568f-042c-47ee-838e-0ade81f72494-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.838157 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.838134 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8xbs\" (UniqueName: \"kubernetes.io/projected/4560568f-042c-47ee-838e-0ade81f72494-kube-api-access-c8xbs\") pod \"isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:26.956230 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:26.956208 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:27.028865 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:27.028832 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c696334-2346-49ce-8d0b-8991eb2436e4-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:27.031382 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:27.031352 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c696334-2346-49ce-8d0b-8991eb2436e4-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:27.081262 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:27.081239 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz"] Apr 24 21:43:27.083850 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:43:27.083821 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4560568f_042c_47ee_838e_0ade81f72494.slice/crio-03eff46daf5b987716020c0f790e4b4f46651e1d591b4a74fba391379237a7ca WatchSource:0}: Error finding container 03eff46daf5b987716020c0f790e4b4f46651e1d591b4a74fba391379237a7ca: Status 404 returned error can't find the container with id 03eff46daf5b987716020c0f790e4b4f46651e1d591b4a74fba391379237a7ca Apr 24 21:43:27.267010 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:27.266982 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:27.396368 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:27.396344 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94"] Apr 24 21:43:27.398471 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:43:27.398435 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c696334_2346_49ce_8d0b_8991eb2436e4.slice/crio-fc1e7452f4480def64e76953b207c5f21680ee276dda9d69ffa60dd00c1f972a WatchSource:0}: Error finding container fc1e7452f4480def64e76953b207c5f21680ee276dda9d69ffa60dd00c1f972a: Status 404 returned error can't find the container with id fc1e7452f4480def64e76953b207c5f21680ee276dda9d69ffa60dd00c1f972a Apr 24 21:43:27.550256 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:27.550214 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.36:8643/healthz\": dial tcp 10.132.0.36:8643: connect: connection refused" Apr 24 21:43:27.554545 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:27.554493 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 21:43:27.815724 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:27.815682 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" event={"ID":"0c696334-2346-49ce-8d0b-8991eb2436e4","Type":"ContainerStarted","Data":"f3e5226edf4ae609b86e247165badfefb15aa17acb311639be007e91d90428c8"} Apr 24 21:43:27.815992 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:27.815734 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" event={"ID":"0c696334-2346-49ce-8d0b-8991eb2436e4","Type":"ContainerStarted","Data":"fc1e7452f4480def64e76953b207c5f21680ee276dda9d69ffa60dd00c1f972a"} Apr 24 21:43:27.817148 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:27.817121 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" event={"ID":"4560568f-042c-47ee-838e-0ade81f72494","Type":"ContainerStarted","Data":"eddd905d3d3dbc838273c43b0ca38d82d6a369131b9f95b1a496057abce7b61f"} Apr 24 21:43:27.817148 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:27.817150 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" event={"ID":"4560568f-042c-47ee-838e-0ade81f72494","Type":"ContainerStarted","Data":"03eff46daf5b987716020c0f790e4b4f46651e1d591b4a74fba391379237a7ca"} Apr 24 21:43:28.476333 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:28.476292 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.132.0.35:8643/healthz\": dial tcp 10.132.0.35:8643: connect: connection refused" Apr 24 21:43:28.482189 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:28.482154 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 21:43:29.795851 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.795828 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:43:29.826372 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.826338 2567 generic.go:358] "Generic (PLEG): container finished" podID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerID="08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb" exitCode=0 Apr 24 21:43:29.826555 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.826435 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" Apr 24 21:43:29.826555 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.826458 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" event={"ID":"269a8e34-b07b-4ce4-92b2-f57660f9df87","Type":"ContainerDied","Data":"08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb"} Apr 24 21:43:29.826555 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.826503 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx" event={"ID":"269a8e34-b07b-4ce4-92b2-f57660f9df87","Type":"ContainerDied","Data":"1dbefc5a3c2c81cb5c9e72a5ce0b8018d2adf66fb6d2aa1e720fb22f92844ba3"} Apr 24 21:43:29.826555 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.826540 2567 scope.go:117] "RemoveContainer" containerID="15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff" Apr 24 21:43:29.836127 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.836110 2567 scope.go:117] "RemoveContainer" containerID="08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb" Apr 24 21:43:29.843147 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.843132 2567 scope.go:117] "RemoveContainer" containerID="249ec86eb7dc29af7bc3dc0b605fc47bc5593bb73701b4b7f2b9ae97b64d4680" Apr 24 21:43:29.851503 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.851487 2567 scope.go:117] "RemoveContainer" containerID="15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff" Apr 24 21:43:29.851764 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:43:29.851744 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff\": container with ID starting with 15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff not found: ID does not exist" containerID="15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff" Apr 24 21:43:29.851844 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.851776 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff"} err="failed to get container status 
\"15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff\": rpc error: code = NotFound desc = could not find container \"15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff\": container with ID starting with 15e5777897e6eff68b8c8055ca8d96286d9f701f8e9514b559f4cdb45ae687ff not found: ID does not exist" Apr 24 21:43:29.851844 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.851802 2567 scope.go:117] "RemoveContainer" containerID="08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb" Apr 24 21:43:29.852064 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:43:29.852046 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb\": container with ID starting with 08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb not found: ID does not exist" containerID="08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb" Apr 24 21:43:29.852106 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.852072 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb"} err="failed to get container status \"08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb\": rpc error: code = NotFound desc = could not find container \"08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb\": container with ID starting with 08551accf03283c45f3821d1b15af04d4da074e49c69a5e645e61fe5ae222fcb not found: ID does not exist" Apr 24 21:43:29.852106 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.852088 2567 scope.go:117] "RemoveContainer" containerID="249ec86eb7dc29af7bc3dc0b605fc47bc5593bb73701b4b7f2b9ae97b64d4680" Apr 24 21:43:29.852337 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:43:29.852316 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"249ec86eb7dc29af7bc3dc0b605fc47bc5593bb73701b4b7f2b9ae97b64d4680\": container with ID starting with 249ec86eb7dc29af7bc3dc0b605fc47bc5593bb73701b4b7f2b9ae97b64d4680 not found: ID does not exist" containerID="249ec86eb7dc29af7bc3dc0b605fc47bc5593bb73701b4b7f2b9ae97b64d4680" Apr 24 21:43:29.852407 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.852353 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"249ec86eb7dc29af7bc3dc0b605fc47bc5593bb73701b4b7f2b9ae97b64d4680"} err="failed to get container status \"249ec86eb7dc29af7bc3dc0b605fc47bc5593bb73701b4b7f2b9ae97b64d4680\": rpc error: code = NotFound desc = could not find container \"249ec86eb7dc29af7bc3dc0b605fc47bc5593bb73701b4b7f2b9ae97b64d4680\": container with ID starting with 249ec86eb7dc29af7bc3dc0b605fc47bc5593bb73701b4b7f2b9ae97b64d4680 not found: ID does not exist" Apr 24 21:43:29.852407 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.852364 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/269a8e34-b07b-4ce4-92b2-f57660f9df87-proxy-tls\") pod \"269a8e34-b07b-4ce4-92b2-f57660f9df87\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " Apr 24 21:43:29.852407 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.852395 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/269a8e34-b07b-4ce4-92b2-f57660f9df87-kserve-provision-location\") pod \"269a8e34-b07b-4ce4-92b2-f57660f9df87\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " Apr 24 21:43:29.852564 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.852489 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f788\" (UniqueName: \"kubernetes.io/projected/269a8e34-b07b-4ce4-92b2-f57660f9df87-kube-api-access-7f788\") pod \"269a8e34-b07b-4ce4-92b2-f57660f9df87\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " Apr 24 21:43:29.852610 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.852588 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/269a8e34-b07b-4ce4-92b2-f57660f9df87-isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config\") pod \"269a8e34-b07b-4ce4-92b2-f57660f9df87\" (UID: \"269a8e34-b07b-4ce4-92b2-f57660f9df87\") " Apr 24 21:43:29.852804 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.852780 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/269a8e34-b07b-4ce4-92b2-f57660f9df87-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "269a8e34-b07b-4ce4-92b2-f57660f9df87" (UID: "269a8e34-b07b-4ce4-92b2-f57660f9df87"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:29.852926 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.852905 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/269a8e34-b07b-4ce4-92b2-f57660f9df87-kserve-provision-location\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:43:29.852995 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.852965 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269a8e34-b07b-4ce4-92b2-f57660f9df87-isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config") pod "269a8e34-b07b-4ce4-92b2-f57660f9df87" (UID: "269a8e34-b07b-4ce4-92b2-f57660f9df87"). InnerVolumeSpecName "isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:43:29.854395 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.854370 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269a8e34-b07b-4ce4-92b2-f57660f9df87-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "269a8e34-b07b-4ce4-92b2-f57660f9df87" (UID: "269a8e34-b07b-4ce4-92b2-f57660f9df87"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:43:29.854489 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.854464 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269a8e34-b07b-4ce4-92b2-f57660f9df87-kube-api-access-7f788" (OuterVolumeSpecName: "kube-api-access-7f788") pod "269a8e34-b07b-4ce4-92b2-f57660f9df87" (UID: "269a8e34-b07b-4ce4-92b2-f57660f9df87"). InnerVolumeSpecName "kube-api-access-7f788". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:43:29.954073 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.954008 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7f788\" (UniqueName: \"kubernetes.io/projected/269a8e34-b07b-4ce4-92b2-f57660f9df87-kube-api-access-7f788\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:43:29.954073 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.954032 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/269a8e34-b07b-4ce4-92b2-f57660f9df87-isvc-xgboost-graph-raw-62689-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:43:29.954073 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:29.954043 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/269a8e34-b07b-4ce4-92b2-f57660f9df87-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:43:30.151660 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.151630 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx"] Apr 24 21:43:30.154065 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.154031 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-62689-predictor-bc95c6d69-pvkhx"] Apr 24 21:43:30.407675 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.407652 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:43:30.459208 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.459178 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl7dn\" (UniqueName: \"kubernetes.io/projected/f7636b02-9417-4eed-90f5-a4d603191037-kube-api-access-vl7dn\") pod \"f7636b02-9417-4eed-90f5-a4d603191037\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " Apr 24 21:43:30.459345 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.459232 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7636b02-9417-4eed-90f5-a4d603191037-proxy-tls\") pod \"f7636b02-9417-4eed-90f5-a4d603191037\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " Apr 24 21:43:30.459345 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.459293 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7636b02-9417-4eed-90f5-a4d603191037-kserve-provision-location\") pod \"f7636b02-9417-4eed-90f5-a4d603191037\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " Apr 24 21:43:30.459345 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.459324 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f7636b02-9417-4eed-90f5-a4d603191037-isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config\") pod \"f7636b02-9417-4eed-90f5-a4d603191037\" (UID: \"f7636b02-9417-4eed-90f5-a4d603191037\") " Apr 24 21:43:30.459674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.459648 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f7636b02-9417-4eed-90f5-a4d603191037-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f7636b02-9417-4eed-90f5-a4d603191037" (UID: "f7636b02-9417-4eed-90f5-a4d603191037"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:30.459762 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.459689 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7636b02-9417-4eed-90f5-a4d603191037-isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config") pod "f7636b02-9417-4eed-90f5-a4d603191037" (UID: "f7636b02-9417-4eed-90f5-a4d603191037"). InnerVolumeSpecName "isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:43:30.461114 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.461086 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7636b02-9417-4eed-90f5-a4d603191037-kube-api-access-vl7dn" (OuterVolumeSpecName: "kube-api-access-vl7dn") pod "f7636b02-9417-4eed-90f5-a4d603191037" (UID: "f7636b02-9417-4eed-90f5-a4d603191037"). InnerVolumeSpecName "kube-api-access-vl7dn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:43:30.461199 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.461154 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7636b02-9417-4eed-90f5-a4d603191037-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f7636b02-9417-4eed-90f5-a4d603191037" (UID: "f7636b02-9417-4eed-90f5-a4d603191037"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:43:30.559930 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.559875 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7636b02-9417-4eed-90f5-a4d603191037-kserve-provision-location\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:43:30.559930 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.559898 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f7636b02-9417-4eed-90f5-a4d603191037-isvc-sklearn-graph-raw-62689-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:43:30.559930 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.559908 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vl7dn\" (UniqueName: \"kubernetes.io/projected/f7636b02-9417-4eed-90f5-a4d603191037-kube-api-access-vl7dn\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:43:30.559930 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.559918 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f7636b02-9417-4eed-90f5-a4d603191037-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:43:30.831341 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.831284 2567 generic.go:358] "Generic (PLEG): container finished" podID="4560568f-042c-47ee-838e-0ade81f72494" containerID="eddd905d3d3dbc838273c43b0ca38d82d6a369131b9f95b1a496057abce7b61f" exitCode=0 Apr 24 21:43:30.831671 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.831357 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" event={"ID":"4560568f-042c-47ee-838e-0ade81f72494","Type":"ContainerDied","Data":"eddd905d3d3dbc838273c43b0ca38d82d6a369131b9f95b1a496057abce7b61f"} Apr 24 21:43:30.833983 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.833958 2567 generic.go:358] "Generic (PLEG): container finished" podID="f7636b02-9417-4eed-90f5-a4d603191037" containerID="253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40" exitCode=0 Apr 24 21:43:30.834131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.834006 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" event={"ID":"f7636b02-9417-4eed-90f5-a4d603191037","Type":"ContainerDied","Data":"253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40"} Apr 24 21:43:30.834131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.834023 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" event={"ID":"f7636b02-9417-4eed-90f5-a4d603191037","Type":"ContainerDied","Data":"275d425bb6eed8d2c558fe5668e0853a099907cea7c3df64e02ebf7619bb37f5"} Apr 24 21:43:30.834131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.834040 2567 scope.go:117] "RemoveContainer" containerID="dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54" Apr 24 21:43:30.834131 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.834050 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6" Apr 24 21:43:30.866937 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.866918 2567 scope.go:117] "RemoveContainer" containerID="253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40" Apr 24 21:43:30.889987 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.889972 2567 scope.go:117] "RemoveContainer" containerID="04da4c6c2a7696b96148549977cdbb509340b6964f69dd029c1a4b36079cdb7c" Apr 24 21:43:30.909912 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.909894 2567 scope.go:117] "RemoveContainer" containerID="dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54" Apr 24 21:43:30.910158 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:43:30.910138 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54\": container with ID starting with dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54 not found: ID does not exist" containerID="dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54" Apr 24 21:43:30.910241 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.910168 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54"} err="failed to get container status \"dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54\": rpc error: code = NotFound desc = could not find container \"dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54\": container with ID starting with dd551b83d0c5f97f6aa4910e02bcdd168d85aa6f0c54c6e083d76a53efcc7c54 not found: ID does not exist" Apr 24 21:43:30.910241 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.910192 2567 scope.go:117] "RemoveContainer" containerID="253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40" Apr 24 21:43:30.910481 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:43:30.910459 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40\": container with ID starting with 253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40 not found: ID does not exist" containerID="253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40" Apr 24 21:43:30.910543 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.910490 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40"} err="failed to get container status \"253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40\": rpc error: code = NotFound desc = could not find container \"253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40\": container with ID starting with 253dc835411772e5d6ca613d937968119517a25673e697d139b162f6f5b97c40 not found: ID does not exist" Apr 24 21:43:30.910543 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.910514 2567 scope.go:117] "RemoveContainer" containerID="04da4c6c2a7696b96148549977cdbb509340b6964f69dd029c1a4b36079cdb7c" Apr 24 21:43:30.910794 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:43:30.910774 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"04da4c6c2a7696b96148549977cdbb509340b6964f69dd029c1a4b36079cdb7c\": container with ID starting with 04da4c6c2a7696b96148549977cdbb509340b6964f69dd029c1a4b36079cdb7c not found: ID does not exist" containerID="04da4c6c2a7696b96148549977cdbb509340b6964f69dd029c1a4b36079cdb7c" Apr 24 21:43:30.910848 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.910799 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04da4c6c2a7696b96148549977cdbb509340b6964f69dd029c1a4b36079cdb7c"} err="failed to get container status \"04da4c6c2a7696b96148549977cdbb509340b6964f69dd029c1a4b36079cdb7c\": rpc error: code = NotFound desc = could not find container \"04da4c6c2a7696b96148549977cdbb509340b6964f69dd029c1a4b36079cdb7c\": container with ID starting with 04da4c6c2a7696b96148549977cdbb509340b6964f69dd029c1a4b36079cdb7c not found: ID does not exist" Apr 24 21:43:30.920370 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.920354 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6"] Apr 24 21:43:30.926218 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:30.926187 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-62689-predictor-6fdc5df96-kx8l6"] Apr 24 21:43:31.839075 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:31.839041 2567 generic.go:358] "Generic (PLEG): container finished" podID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerID="f3e5226edf4ae609b86e247165badfefb15aa17acb311639be007e91d90428c8" exitCode=0 Apr 24 21:43:31.839493 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:31.839110 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" event={"ID":"0c696334-2346-49ce-8d0b-8991eb2436e4","Type":"ContainerDied","Data":"f3e5226edf4ae609b86e247165badfefb15aa17acb311639be007e91d90428c8"} Apr 24 21:43:31.841003 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:31.840979 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" event={"ID":"4560568f-042c-47ee-838e-0ade81f72494","Type":"ContainerStarted","Data":"91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4"} Apr 24 21:43:31.841115 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:31.841025 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" event={"ID":"4560568f-042c-47ee-838e-0ade81f72494","Type":"ContainerStarted","Data":"ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3"} Apr 24 21:43:31.841266 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:31.841247 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:31.874892 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:31.874853 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" podStartSLOduration=5.874841237 podStartE2EDuration="5.874841237s" podCreationTimestamp="2026-04-24 21:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:43:31.873404116 +0000 UTC m=+896.415355712" watchObservedRunningTime="2026-04-24 
21:43:31.874841237 +0000 UTC m=+896.416792836" Apr 24 21:43:31.982025 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:31.982001 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" path="/var/lib/kubelet/pods/269a8e34-b07b-4ce4-92b2-f57660f9df87/volumes" Apr 24 21:43:31.982475 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:31.982460 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7636b02-9417-4eed-90f5-a4d603191037" path="/var/lib/kubelet/pods/f7636b02-9417-4eed-90f5-a4d603191037/volumes" Apr 24 21:43:32.846161 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:32.846128 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" event={"ID":"0c696334-2346-49ce-8d0b-8991eb2436e4","Type":"ContainerStarted","Data":"19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5"} Apr 24 21:43:32.846161 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:32.846168 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" event={"ID":"0c696334-2346-49ce-8d0b-8991eb2436e4","Type":"ContainerStarted","Data":"5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c"} Apr 24 21:43:32.846688 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:32.846379 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:32.846688 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:32.846479 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:32.847509 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:32.847485 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 21:43:32.864825 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:32.864787 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podStartSLOduration=6.864775443 podStartE2EDuration="6.864775443s" podCreationTimestamp="2026-04-24 21:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:43:32.863304526 +0000 UTC m=+897.405256135" watchObservedRunningTime="2026-04-24 21:43:32.864775443 +0000 UTC m=+897.406727041" Apr 24 21:43:33.849587 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:33.849546 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 21:43:33.849979 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:33.849760 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:33.850997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:33.850974 
2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 21:43:34.852362 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:34.852324 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 21:43:35.948444 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:35.948416 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:43:35.949473 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:35.949450 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:43:38.853447 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:38.853421 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:43:38.853961 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:38.853932 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 21:43:39.856640 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:39.856611 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:43:39.857205 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:39.857177 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 21:43:48.854339 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:48.854294 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 21:43:49.857679 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:49.857636 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 21:43:58.853920 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:58.853876 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 21:43:59.857093 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:43:59.857056 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 21:44:08.853909 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:44:08.853873 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 21:44:09.857921 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:44:09.857881 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 21:44:18.854104 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:44:18.854063 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 21:44:19.857339 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:44:19.857302 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 21:44:28.854677 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:44:28.854649 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:44:29.858044 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:44:29.858005 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 21:44:39.858390 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:44:39.858361 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:45:06.593055 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.593020 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94"] Apr 24 21:45:06.593454 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.593355 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" containerID="cri-o://5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c" gracePeriod=30 Apr 24 
21:45:06.593568 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.593429 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kube-rbac-proxy" containerID="cri-o://19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5" gracePeriod=30 Apr 24 21:45:06.639375 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639349 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn"] Apr 24 21:45:06.639722 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639710 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kube-rbac-proxy" Apr 24 21:45:06.639770 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639723 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kube-rbac-proxy" Apr 24 21:45:06.639770 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639739 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" Apr 24 21:45:06.639770 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639744 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" Apr 24 21:45:06.639770 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639756 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="storage-initializer" Apr 24 21:45:06.639770 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639762 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="storage-initializer" Apr 24 21:45:06.639770 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639768 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="storage-initializer" Apr 24 21:45:06.639972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639773 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="storage-initializer" Apr 24 21:45:06.639972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639784 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" Apr 24 21:45:06.639972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639789 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" Apr 24 21:45:06.639972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639797 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kube-rbac-proxy" Apr 24 21:45:06.639972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639802 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kube-rbac-proxy" Apr 24 21:45:06.639972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639854 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7636b02-9417-4eed-90f5-a4d603191037" 
containerName="kube-rbac-proxy" Apr 24 21:45:06.639972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639861 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kube-rbac-proxy" Apr 24 21:45:06.639972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639869 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7636b02-9417-4eed-90f5-a4d603191037" containerName="kserve-container" Apr 24 21:45:06.639972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.639875 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="269a8e34-b07b-4ce4-92b2-f57660f9df87" containerName="kserve-container" Apr 24 21:45:06.643043 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.643028 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:06.645342 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.645254 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-fb0e6-predictor-serving-cert\"" Apr 24 21:45:06.645342 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.645309 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config\"" Apr 24 21:45:06.655409 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.655387 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn"] Apr 24 21:45:06.686183 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.686150 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz"] Apr 24 21:45:06.686632 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.686603 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kserve-container" containerID="cri-o://ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3" gracePeriod=30 Apr 24 21:45:06.686696 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.686671 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kube-rbac-proxy" containerID="cri-o://91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4" gracePeriod=30 Apr 24 21:45:06.707840 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.707818 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtm9d\" (UniqueName: \"kubernetes.io/projected/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-kube-api-access-gtm9d\") pod \"message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn\" (UID: \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\") " pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:06.707955 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.707864 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-proxy-tls\") pod \"message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn\" (UID: 
\"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\") " pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:06.707955 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.707900 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn\" (UID: \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\") " pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:06.809211 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.809176 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtm9d\" (UniqueName: \"kubernetes.io/projected/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-kube-api-access-gtm9d\") pod \"message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn\" (UID: \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\") " pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:06.809379 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.809243 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-proxy-tls\") pod \"message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn\" (UID: \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\") " pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:06.809379 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.809301 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn\" (UID: \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\") " pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:06.809990 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.809968 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn\" (UID: \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\") " pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:06.811798 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.811775 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-proxy-tls\") pod \"message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn\" (UID: \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\") " pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:06.818384 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.818359 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtm9d\" (UniqueName: \"kubernetes.io/projected/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-kube-api-access-gtm9d\") pod \"message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn\" (UID: \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\") " 
pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:06.954288 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:06.954262 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:07.074341 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:07.074290 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn"] Apr 24 21:45:07.077931 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:45:07.077901 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cfcc287_0d5d_4ba7_be93_a76c1ad74d7b.slice/crio-aa8cb9869ddadfd4c8e4b066bba76c748014a738ea98b83df94363f3253e76ac WatchSource:0}: Error finding container aa8cb9869ddadfd4c8e4b066bba76c748014a738ea98b83df94363f3253e76ac: Status 404 returned error can't find the container with id aa8cb9869ddadfd4c8e4b066bba76c748014a738ea98b83df94363f3253e76ac Apr 24 21:45:07.079636 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:07.079616 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:45:07.151803 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:07.151773 2567 generic.go:358] "Generic (PLEG): container finished" podID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerID="19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5" exitCode=2 Apr 24 21:45:07.151914 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:07.151839 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" event={"ID":"0c696334-2346-49ce-8d0b-8991eb2436e4","Type":"ContainerDied","Data":"19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5"} Apr 24 21:45:07.153573 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:07.153552 2567 generic.go:358] "Generic (PLEG): container finished" podID="4560568f-042c-47ee-838e-0ade81f72494" containerID="91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4" exitCode=2 Apr 24 21:45:07.153669 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:07.153632 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" event={"ID":"4560568f-042c-47ee-838e-0ade81f72494","Type":"ContainerDied","Data":"91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4"} Apr 24 21:45:07.154648 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:07.154628 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" event={"ID":"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b","Type":"ContainerStarted","Data":"aa8cb9869ddadfd4c8e4b066bba76c748014a738ea98b83df94363f3253e76ac"} Apr 24 21:45:08.850353 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:08.850318 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 24 21:45:08.854661 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:08.854632 2567 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 24 21:45:09.169917 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:09.169880 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" event={"ID":"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b","Type":"ContainerStarted","Data":"910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d"} Apr 24 21:45:09.169917 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:09.169923 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" event={"ID":"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b","Type":"ContainerStarted","Data":"6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1"} Apr 24 21:45:09.170141 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:09.170108 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:09.170293 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:09.170277 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:09.172098 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:09.172078 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:09.188385 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:09.188343 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" podStartSLOduration=2.164297811 podStartE2EDuration="3.1883324s" podCreationTimestamp="2026-04-24 21:45:06 +0000 UTC" firstStartedPulling="2026-04-24 21:45:07.079746215 +0000 UTC m=+991.621697791" lastFinishedPulling="2026-04-24 21:45:08.103780803 +0000 UTC m=+992.645732380" observedRunningTime="2026-04-24 21:45:09.186140847 +0000 UTC m=+993.728092447" watchObservedRunningTime="2026-04-24 21:45:09.1883324 +0000 UTC m=+993.730283999" Apr 24 21:45:09.852975 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:09.852934 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused" Apr 24 21:45:09.857292 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:09.857262 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 21:45:10.122031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.122008 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:45:10.174428 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.174395 2567 generic.go:358] "Generic (PLEG): container finished" podID="4560568f-042c-47ee-838e-0ade81f72494" containerID="ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3" exitCode=0 Apr 24 21:45:10.174572 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.174476 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" Apr 24 21:45:10.174572 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.174473 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" event={"ID":"4560568f-042c-47ee-838e-0ade81f72494","Type":"ContainerDied","Data":"ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3"} Apr 24 21:45:10.174572 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.174565 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz" event={"ID":"4560568f-042c-47ee-838e-0ade81f72494","Type":"ContainerDied","Data":"03eff46daf5b987716020c0f790e4b4f46651e1d591b4a74fba391379237a7ca"} Apr 24 21:45:10.174691 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.174582 2567 scope.go:117] "RemoveContainer" containerID="91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4" Apr 24 21:45:10.182184 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.182165 2567 scope.go:117] "RemoveContainer" containerID="ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3" Apr 24 21:45:10.189004 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.188985 2567 scope.go:117] "RemoveContainer" containerID="eddd905d3d3dbc838273c43b0ca38d82d6a369131b9f95b1a496057abce7b61f" Apr 24 21:45:10.195861 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.195843 2567 scope.go:117] "RemoveContainer" containerID="91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4" Apr 24 21:45:10.196127 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:45:10.196108 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4\": container with ID starting with 91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4 not found: ID does not exist" containerID="91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4" Apr 24 21:45:10.196169 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.196137 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4"} err="failed to get container status \"91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4\": rpc error: code = NotFound desc = could not find container \"91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4\": container with ID starting with 91d9ed3edb3e704d281656962eac74e75ebae169a9b2da4a7af300599ca0b0c4 not found: ID does not exist" Apr 24 21:45:10.196169 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.196154 2567 scope.go:117] "RemoveContainer" containerID="ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3" Apr 24 21:45:10.196375 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:45:10.196358 2567 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3\": container with ID starting with ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3 not found: ID does not exist" containerID="ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3" Apr 24 21:45:10.196421 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.196381 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3"} err="failed to get container status \"ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3\": rpc error: code = NotFound desc = could not find container \"ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3\": container with ID starting with ff4b78ed5e02a3967d2c12589912be2b5384df7840809a3a6b579c4eceec03a3 not found: ID does not exist" Apr 24 21:45:10.196421 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.196395 2567 scope.go:117] "RemoveContainer" containerID="eddd905d3d3dbc838273c43b0ca38d82d6a369131b9f95b1a496057abce7b61f" Apr 24 21:45:10.196615 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:45:10.196597 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eddd905d3d3dbc838273c43b0ca38d82d6a369131b9f95b1a496057abce7b61f\": container with ID starting with eddd905d3d3dbc838273c43b0ca38d82d6a369131b9f95b1a496057abce7b61f not found: ID does not exist" containerID="eddd905d3d3dbc838273c43b0ca38d82d6a369131b9f95b1a496057abce7b61f" Apr 24 21:45:10.196677 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.196623 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eddd905d3d3dbc838273c43b0ca38d82d6a369131b9f95b1a496057abce7b61f"} err="failed to get container status \"eddd905d3d3dbc838273c43b0ca38d82d6a369131b9f95b1a496057abce7b61f\": rpc error: code = NotFound desc = could not find container \"eddd905d3d3dbc838273c43b0ca38d82d6a369131b9f95b1a496057abce7b61f\": container with ID starting with eddd905d3d3dbc838273c43b0ca38d82d6a369131b9f95b1a496057abce7b61f not found: ID does not exist" Apr 24 21:45:10.242941 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.242919 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8xbs\" (UniqueName: \"kubernetes.io/projected/4560568f-042c-47ee-838e-0ade81f72494-kube-api-access-c8xbs\") pod \"4560568f-042c-47ee-838e-0ade81f72494\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " Apr 24 21:45:10.243028 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.242956 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4560568f-042c-47ee-838e-0ade81f72494-isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\") pod \"4560568f-042c-47ee-838e-0ade81f72494\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " Apr 24 21:45:10.243028 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.243019 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4560568f-042c-47ee-838e-0ade81f72494-proxy-tls\") pod \"4560568f-042c-47ee-838e-0ade81f72494\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " Apr 24 21:45:10.243101 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:45:10.243035 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4560568f-042c-47ee-838e-0ade81f72494-kserve-provision-location\") pod \"4560568f-042c-47ee-838e-0ade81f72494\" (UID: \"4560568f-042c-47ee-838e-0ade81f72494\") " Apr 24 21:45:10.243373 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.243353 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4560568f-042c-47ee-838e-0ade81f72494-isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config") pod "4560568f-042c-47ee-838e-0ade81f72494" (UID: "4560568f-042c-47ee-838e-0ade81f72494"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:45:10.243438 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.243369 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4560568f-042c-47ee-838e-0ade81f72494-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4560568f-042c-47ee-838e-0ade81f72494" (UID: "4560568f-042c-47ee-838e-0ade81f72494"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:10.244890 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.244868 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4560568f-042c-47ee-838e-0ade81f72494-kube-api-access-c8xbs" (OuterVolumeSpecName: "kube-api-access-c8xbs") pod "4560568f-042c-47ee-838e-0ade81f72494" (UID: "4560568f-042c-47ee-838e-0ade81f72494"). InnerVolumeSpecName "kube-api-access-c8xbs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:10.244987 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.244968 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4560568f-042c-47ee-838e-0ade81f72494-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4560568f-042c-47ee-838e-0ade81f72494" (UID: "4560568f-042c-47ee-838e-0ade81f72494"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:10.343635 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.343609 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c8xbs\" (UniqueName: \"kubernetes.io/projected/4560568f-042c-47ee-838e-0ade81f72494-kube-api-access-c8xbs\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:45:10.343635 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.343633 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4560568f-042c-47ee-838e-0ade81f72494-isvc-xgboost-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:45:10.343781 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.343712 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4560568f-042c-47ee-838e-0ade81f72494-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:45:10.343781 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.343722 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4560568f-042c-47ee-838e-0ade81f72494-kserve-provision-location\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:45:10.498123 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.498098 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz"] Apr 24 21:45:10.502486 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.502463 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-2b4f4-predictor-7b7f6f6765-fcrsz"] Apr 24 21:45:10.836401 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.836378 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:45:10.948093 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.948060 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c696334-2346-49ce-8d0b-8991eb2436e4-kserve-provision-location\") pod \"0c696334-2346-49ce-8d0b-8991eb2436e4\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " Apr 24 21:45:10.948543 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.948110 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c696334-2346-49ce-8d0b-8991eb2436e4-isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\") pod \"0c696334-2346-49ce-8d0b-8991eb2436e4\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " Apr 24 21:45:10.948543 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.948170 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c696334-2346-49ce-8d0b-8991eb2436e4-proxy-tls\") pod \"0c696334-2346-49ce-8d0b-8991eb2436e4\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " Apr 24 21:45:10.948543 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.948264 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85ckx\" (UniqueName: \"kubernetes.io/projected/0c696334-2346-49ce-8d0b-8991eb2436e4-kube-api-access-85ckx\") pod \"0c696334-2346-49ce-8d0b-8991eb2436e4\" (UID: \"0c696334-2346-49ce-8d0b-8991eb2436e4\") " Apr 24 21:45:10.948543 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.948404 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c696334-2346-49ce-8d0b-8991eb2436e4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0c696334-2346-49ce-8d0b-8991eb2436e4" (UID: "0c696334-2346-49ce-8d0b-8991eb2436e4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:10.948543 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.948447 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c696334-2346-49ce-8d0b-8991eb2436e4-isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config") pod "0c696334-2346-49ce-8d0b-8991eb2436e4" (UID: "0c696334-2346-49ce-8d0b-8991eb2436e4"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:45:10.948750 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.948584 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0c696334-2346-49ce-8d0b-8991eb2436e4-kserve-provision-location\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:45:10.948750 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.948606 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c696334-2346-49ce-8d0b-8991eb2436e4-isvc-sklearn-graph-raw-hpa-2b4f4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:45:10.950129 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.950112 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c696334-2346-49ce-8d0b-8991eb2436e4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0c696334-2346-49ce-8d0b-8991eb2436e4" (UID: "0c696334-2346-49ce-8d0b-8991eb2436e4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:10.950282 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:10.950268 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c696334-2346-49ce-8d0b-8991eb2436e4-kube-api-access-85ckx" (OuterVolumeSpecName: "kube-api-access-85ckx") pod "0c696334-2346-49ce-8d0b-8991eb2436e4" (UID: "0c696334-2346-49ce-8d0b-8991eb2436e4"). InnerVolumeSpecName "kube-api-access-85ckx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:11.049084 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.049024 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c696334-2346-49ce-8d0b-8991eb2436e4-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:45:11.049084 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.049045 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-85ckx\" (UniqueName: \"kubernetes.io/projected/0c696334-2346-49ce-8d0b-8991eb2436e4-kube-api-access-85ckx\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:45:11.179071 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.179040 2567 generic.go:358] "Generic (PLEG): container finished" podID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerID="5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c" exitCode=0 Apr 24 21:45:11.179201 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.179124 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" Apr 24 21:45:11.179201 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.179122 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" event={"ID":"0c696334-2346-49ce-8d0b-8991eb2436e4","Type":"ContainerDied","Data":"5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c"} Apr 24 21:45:11.179302 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.179233 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94" event={"ID":"0c696334-2346-49ce-8d0b-8991eb2436e4","Type":"ContainerDied","Data":"fc1e7452f4480def64e76953b207c5f21680ee276dda9d69ffa60dd00c1f972a"} Apr 24 21:45:11.179302 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.179268 2567 scope.go:117] "RemoveContainer" containerID="19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5" Apr 24 21:45:11.187253 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.187218 2567 scope.go:117] "RemoveContainer" containerID="5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c" Apr 24 21:45:11.193889 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.193873 2567 scope.go:117] "RemoveContainer" containerID="f3e5226edf4ae609b86e247165badfefb15aa17acb311639be007e91d90428c8" Apr 24 21:45:11.200851 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.200828 2567 scope.go:117] "RemoveContainer" containerID="19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5" Apr 24 21:45:11.201112 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:45:11.201091 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5\": container with ID starting with 19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5 not found: ID does not exist" containerID="19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5" Apr 24 21:45:11.201217 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.201118 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5"} err="failed to get container status \"19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5\": rpc error: code = NotFound desc = could not find container \"19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5\": container with ID starting with 19cc33196a93c54d7623106585bb7734389b61b1bf4e50227666d6642c1da1f5 not found: ID does not exist" Apr 24 21:45:11.201217 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.201136 2567 scope.go:117] "RemoveContainer" containerID="5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c" Apr 24 21:45:11.201389 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:45:11.201365 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c\": container with ID starting with 5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c not found: ID does not exist" containerID="5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c" Apr 24 21:45:11.201454 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.201397 2567 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c"} err="failed to get container status \"5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c\": rpc error: code = NotFound desc = could not find container \"5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c\": container with ID starting with 5999b70e35c9ac8ac0c55dfb10097530bcc1dde0a8f8af9140c028ed7b3e950c not found: ID does not exist" Apr 24 21:45:11.201454 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.201416 2567 scope.go:117] "RemoveContainer" containerID="f3e5226edf4ae609b86e247165badfefb15aa17acb311639be007e91d90428c8" Apr 24 21:45:11.201654 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:45:11.201633 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3e5226edf4ae609b86e247165badfefb15aa17acb311639be007e91d90428c8\": container with ID starting with f3e5226edf4ae609b86e247165badfefb15aa17acb311639be007e91d90428c8 not found: ID does not exist" containerID="f3e5226edf4ae609b86e247165badfefb15aa17acb311639be007e91d90428c8" Apr 24 21:45:11.201712 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.201656 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e5226edf4ae609b86e247165badfefb15aa17acb311639be007e91d90428c8"} err="failed to get container status \"f3e5226edf4ae609b86e247165badfefb15aa17acb311639be007e91d90428c8\": rpc error: code = NotFound desc = could not find container \"f3e5226edf4ae609b86e247165badfefb15aa17acb311639be007e91d90428c8\": container with ID starting with f3e5226edf4ae609b86e247165badfefb15aa17acb311639be007e91d90428c8 not found: ID does not exist" Apr 24 21:45:11.201905 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.201888 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94"] Apr 24 21:45:11.206382 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.206363 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-2b4f4-predictor-8485789968-6jf94"] Apr 24 21:45:11.981992 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.981962 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" path="/var/lib/kubelet/pods/0c696334-2346-49ce-8d0b-8991eb2436e4/volumes" Apr 24 21:45:11.982441 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:11.982427 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4560568f-042c-47ee-838e-0ade81f72494" path="/var/lib/kubelet/pods/4560568f-042c-47ee-838e-0ade81f72494/volumes" Apr 24 21:45:16.184414 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.184387 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:45:16.692770 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.692737 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9"] Apr 24 21:45:16.693087 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693076 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="storage-initializer" Apr 24 21:45:16.693132 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693090 2567 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="storage-initializer" Apr 24 21:45:16.693132 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693104 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" Apr 24 21:45:16.693132 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693111 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" Apr 24 21:45:16.693132 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693128 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kube-rbac-proxy" Apr 24 21:45:16.693255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693133 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kube-rbac-proxy" Apr 24 21:45:16.693255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693143 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kube-rbac-proxy" Apr 24 21:45:16.693255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693148 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kube-rbac-proxy" Apr 24 21:45:16.693255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693155 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="storage-initializer" Apr 24 21:45:16.693255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693160 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="storage-initializer" Apr 24 21:45:16.693255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693168 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kserve-container" Apr 24 21:45:16.693255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693173 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kserve-container" Apr 24 21:45:16.693255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693219 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kserve-container" Apr 24 21:45:16.693255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693226 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4560568f-042c-47ee-838e-0ade81f72494" containerName="kube-rbac-proxy" Apr 24 21:45:16.693255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693236 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kserve-container" Apr 24 21:45:16.693255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.693243 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c696334-2346-49ce-8d0b-8991eb2436e4" containerName="kube-rbac-proxy" Apr 24 21:45:16.697988 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.697970 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:16.703999 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.703974 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config\"" Apr 24 21:45:16.704114 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.703974 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-fb0e6-predictor-serving-cert\"" Apr 24 21:45:16.710247 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.710227 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9"] Apr 24 21:45:16.791217 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.791191 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhhz2\" (UniqueName: \"kubernetes.io/projected/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-kube-api-access-lhhz2\") pod \"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:16.791336 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.791258 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-proxy-tls\") pod \"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:16.791336 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.791283 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:16.791336 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.791316 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-kserve-provision-location\") pod \"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:16.892715 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.892684 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-kserve-provision-location\") pod \"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:16.892851 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.892731 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhhz2\" (UniqueName: \"kubernetes.io/projected/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-kube-api-access-lhhz2\") pod 
\"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:16.892851 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.892788 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-proxy-tls\") pod \"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:16.892851 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.892805 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:16.893006 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:45:16.892923 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-serving-cert: secret "isvc-logger-raw-fb0e6-predictor-serving-cert" not found Apr 24 21:45:16.893006 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:45:16.892986 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-proxy-tls podName:7dd3c20b-5f46-4349-ba0f-91f74891d4ba nodeName:}" failed. No retries permitted until 2026-04-24 21:45:17.392971148 +0000 UTC m=+1001.934922730 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-proxy-tls") pod "isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" (UID: "7dd3c20b-5f46-4349-ba0f-91f74891d4ba") : secret "isvc-logger-raw-fb0e6-predictor-serving-cert" not found Apr 24 21:45:16.893112 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.893051 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-kserve-provision-location\") pod \"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:16.893352 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.893334 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:16.901934 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:16.901903 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhhz2\" (UniqueName: \"kubernetes.io/projected/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-kube-api-access-lhhz2\") pod \"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:17.396318 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:17.396281 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-proxy-tls\") pod \"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:17.398797 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:17.398760 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-proxy-tls\") pod \"isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:17.608507 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:17.608479 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:17.729682 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:17.729633 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9"] Apr 24 21:45:17.732146 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:45:17.732113 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dd3c20b_5f46_4349_ba0f_91f74891d4ba.slice/crio-366162a9159172a1adabb419650b145c2abe1c27ae9d627c39b777a4ffa6f724 WatchSource:0}: Error finding container 366162a9159172a1adabb419650b145c2abe1c27ae9d627c39b777a4ffa6f724: Status 404 returned error can't find the container with id 366162a9159172a1adabb419650b145c2abe1c27ae9d627c39b777a4ffa6f724 Apr 24 21:45:18.204071 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:18.204025 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" event={"ID":"7dd3c20b-5f46-4349-ba0f-91f74891d4ba","Type":"ContainerStarted","Data":"78809c1bf5d6a5805488d3b9878cc2f3c136ad8b5a8a6431b5417a2b19c940c9"} Apr 24 21:45:18.204071 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:18.204068 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" event={"ID":"7dd3c20b-5f46-4349-ba0f-91f74891d4ba","Type":"ContainerStarted","Data":"366162a9159172a1adabb419650b145c2abe1c27ae9d627c39b777a4ffa6f724"} Apr 24 21:45:22.219503 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:22.219464 2567 generic.go:358] "Generic (PLEG): container finished" podID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerID="78809c1bf5d6a5805488d3b9878cc2f3c136ad8b5a8a6431b5417a2b19c940c9" exitCode=0 Apr 24 21:45:22.219872 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:22.219546 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" event={"ID":"7dd3c20b-5f46-4349-ba0f-91f74891d4ba","Type":"ContainerDied","Data":"78809c1bf5d6a5805488d3b9878cc2f3c136ad8b5a8a6431b5417a2b19c940c9"} Apr 24 21:45:23.224911 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:23.224874 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" event={"ID":"7dd3c20b-5f46-4349-ba0f-91f74891d4ba","Type":"ContainerStarted","Data":"3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c"} Apr 24 21:45:23.224911 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:23.224914 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" event={"ID":"7dd3c20b-5f46-4349-ba0f-91f74891d4ba","Type":"ContainerStarted","Data":"7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70"} Apr 24 21:45:23.225357 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:23.224924 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" event={"ID":"7dd3c20b-5f46-4349-ba0f-91f74891d4ba","Type":"ContainerStarted","Data":"fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c"} Apr 24 21:45:23.225357 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:23.225228 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" 
Apr 24 21:45:23.225357 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:23.225261 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:23.225357 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:23.225272 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:23.226776 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:23.226743 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:45:23.227432 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:23.227397 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:23.247016 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:23.246978 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podStartSLOduration=7.246967193 podStartE2EDuration="7.246967193s" podCreationTimestamp="2026-04-24 21:45:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:45:23.2445604 +0000 UTC m=+1007.786512000" watchObservedRunningTime="2026-04-24 21:45:23.246967193 +0000 UTC m=+1007.788918791" Apr 24 21:45:24.228284 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:24.228242 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:45:24.228780 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:24.228754 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:29.232106 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:29.232077 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:45:29.232744 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:29.232715 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:45:29.232982 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:29.232958 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:39.232892 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:45:39.232855 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:45:39.233337 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:39.233313 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:49.233139 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:49.233096 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:45:49.233647 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:49.233563 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:59.233217 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:59.233175 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:45:59.233641 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:45:59.233559 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:09.233499 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:09.233452 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:46:09.234001 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:09.233968 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:19.232694 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:19.232654 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:46:19.233118 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:19.233084 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:29.233255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:29.233179 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:46:29.233743 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:29.233408 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:46:41.688483 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.688443 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn_1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b/kserve-container/0.log" Apr 24 21:46:41.886235 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.886163 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9"] Apr 24 21:46:41.886691 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.886657 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" containerID="cri-o://fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c" gracePeriod=30 Apr 24 21:46:41.886832 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.886694 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" containerID="cri-o://3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c" gracePeriod=30 Apr 24 21:46:41.886832 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.886716 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kube-rbac-proxy" containerID="cri-o://7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70" gracePeriod=30 Apr 24 21:46:41.911811 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.911789 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5"] Apr 24 21:46:41.915419 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.915403 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:41.917258 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.917239 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-e8206-predictor-serving-cert\"" Apr 24 21:46:41.917345 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.917276 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config\"" Apr 24 21:46:41.923873 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.923844 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5"] Apr 24 21:46:41.951359 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.951301 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:41.951359 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.951337 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-proxy-tls\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:41.951739 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.951373 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwnjn\" (UniqueName: \"kubernetes.io/projected/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-kube-api-access-qwnjn\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:41.951739 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.951468 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:41.987046 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.987022 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn"] Apr 24 21:46:41.987309 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:41.987284 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" podUID="1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" containerName="kserve-container" containerID="cri-o://910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d" gracePeriod=30 Apr 24 21:46:41.987412 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:46:41.987326 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" podUID="1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" containerName="kube-rbac-proxy" containerID="cri-o://6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1" gracePeriod=30 Apr 24 21:46:42.052141 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.052115 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:42.052436 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.052415 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:42.052552 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.052449 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-proxy-tls\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:42.052552 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.052514 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwnjn\" (UniqueName: \"kubernetes.io/projected/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-kube-api-access-qwnjn\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:42.052687 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:46:42.052623 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-serving-cert: secret "isvc-sklearn-scale-raw-e8206-predictor-serving-cert" not found Apr 24 21:46:42.052746 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:46:42.052687 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-proxy-tls podName:f9175222-2aa6-4fc1-90f3-9e45fe75afc3 nodeName:}" failed. No retries permitted until 2026-04-24 21:46:42.552665814 +0000 UTC m=+1087.094617415 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-proxy-tls") pod "isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" (UID: "f9175222-2aa6-4fc1-90f3-9e45fe75afc3") : secret "isvc-sklearn-scale-raw-e8206-predictor-serving-cert" not found Apr 24 21:46:42.052746 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.052732 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:42.052912 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.052888 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:42.064808 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.064786 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwnjn\" (UniqueName: \"kubernetes.io/projected/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-kube-api-access-qwnjn\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:42.220661 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.220638 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:46:42.255085 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.255052 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtm9d\" (UniqueName: \"kubernetes.io/projected/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-kube-api-access-gtm9d\") pod \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\" (UID: \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\") " Apr 24 21:46:42.255220 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.255134 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config\") pod \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\" (UID: \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\") " Apr 24 21:46:42.255220 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.255197 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-proxy-tls\") pod \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\" (UID: \"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b\") " Apr 24 21:46:42.255611 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.255580 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config") pod "1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" (UID: "1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b"). InnerVolumeSpecName "message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:46:42.257031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.257009 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-kube-api-access-gtm9d" (OuterVolumeSpecName: "kube-api-access-gtm9d") pod "1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" (UID: "1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b"). InnerVolumeSpecName "kube-api-access-gtm9d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:46:42.257130 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.257108 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" (UID: "1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:46:42.356454 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.356425 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gtm9d\" (UniqueName: \"kubernetes.io/projected/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-kube-api-access-gtm9d\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:46:42.356454 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.356453 2567 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-message-dumper-raw-fb0e6-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:46:42.356633 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.356470 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:46:42.492914 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.492837 2567 generic.go:358] "Generic (PLEG): container finished" podID="1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" containerID="6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1" exitCode=2 Apr 24 21:46:42.492914 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.492862 2567 generic.go:358] "Generic (PLEG): container finished" podID="1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" containerID="910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d" exitCode=2 Apr 24 21:46:42.493103 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.492922 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" Apr 24 21:46:42.493103 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.492920 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" event={"ID":"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b","Type":"ContainerDied","Data":"6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1"} Apr 24 21:46:42.493103 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.493026 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" event={"ID":"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b","Type":"ContainerDied","Data":"910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d"} Apr 24 21:46:42.493103 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.493042 2567 scope.go:117] "RemoveContainer" containerID="6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1" Apr 24 21:46:42.493103 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.493042 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn" event={"ID":"1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b","Type":"ContainerDied","Data":"aa8cb9869ddadfd4c8e4b066bba76c748014a738ea98b83df94363f3253e76ac"} Apr 24 21:46:42.495228 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.495209 2567 generic.go:358] "Generic (PLEG): container finished" podID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerID="7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70" exitCode=2 Apr 24 21:46:42.495383 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.495295 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" event={"ID":"7dd3c20b-5f46-4349-ba0f-91f74891d4ba","Type":"ContainerDied","Data":"7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70"} Apr 24 21:46:42.501587 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.501566 2567 scope.go:117] "RemoveContainer" containerID="910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d" Apr 24 21:46:42.509120 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.509104 2567 scope.go:117] "RemoveContainer" containerID="6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1" Apr 24 21:46:42.509371 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:46:42.509351 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1\": container with ID starting with 6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1 not found: ID does not exist" containerID="6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1" Apr 24 21:46:42.509425 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.509377 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1"} err="failed to get container status \"6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1\": rpc error: code = NotFound desc = could not find container \"6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1\": container with ID starting with 6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1 not found: ID does not exist" Apr 24 21:46:42.509425 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.509399 2567 scope.go:117] "RemoveContainer" containerID="910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d" Apr 24 21:46:42.509634 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:46:42.509618 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d\": container with ID starting with 910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d not found: ID does not exist" containerID="910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d" Apr 24 21:46:42.509681 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.509638 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d"} err="failed to get container status \"910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d\": rpc error: code = NotFound desc = could not find container \"910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d\": container with ID starting with 910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d not found: ID does not exist" Apr 24 21:46:42.509681 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.509652 2567 scope.go:117] "RemoveContainer" containerID="6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1" Apr 24 21:46:42.509843 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.509827 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1"} err="failed to get container status 
\"6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1\": rpc error: code = NotFound desc = could not find container \"6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1\": container with ID starting with 6b56ea36a000c66dc59731a9186a09c60488c0558214da743fb03a37115c3ee1 not found: ID does not exist" Apr 24 21:46:42.509932 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.509843 2567 scope.go:117] "RemoveContainer" containerID="910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d" Apr 24 21:46:42.510072 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.510056 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d"} err="failed to get container status \"910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d\": rpc error: code = NotFound desc = could not find container \"910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d\": container with ID starting with 910c16d2200a28685b3a75e272a60031be1e7619067164f8209c82b596b5f00d not found: ID does not exist" Apr 24 21:46:42.515008 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.514987 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn"] Apr 24 21:46:42.519113 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.519095 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-fb0e6-predictor-774f648d7f-fgmbn"] Apr 24 21:46:42.558374 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.558354 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-proxy-tls\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:42.560502 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.560479 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-proxy-tls\") pod \"isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:42.826142 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.826067 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:42.944255 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:42.944229 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5"] Apr 24 21:46:42.946785 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:46:42.946745 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9175222_2aa6_4fc1_90f3_9e45fe75afc3.slice/crio-37889006e2ea700b306f3ce38d1b50dc583f04fad3517226968efecab884d8e7 WatchSource:0}: Error finding container 37889006e2ea700b306f3ce38d1b50dc583f04fad3517226968efecab884d8e7: Status 404 returned error can't find the container with id 37889006e2ea700b306f3ce38d1b50dc583f04fad3517226968efecab884d8e7 Apr 24 21:46:43.499934 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:43.499896 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" event={"ID":"f9175222-2aa6-4fc1-90f3-9e45fe75afc3","Type":"ContainerStarted","Data":"6eb68d22717c32e0202f6883008afcff4d9678c7a8a1df844fb9e23c0b9ec40c"} Apr 24 21:46:43.499934 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:43.499939 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" event={"ID":"f9175222-2aa6-4fc1-90f3-9e45fe75afc3","Type":"ContainerStarted","Data":"37889006e2ea700b306f3ce38d1b50dc583f04fad3517226968efecab884d8e7"} Apr 24 21:46:43.982710 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:43.982676 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" path="/var/lib/kubelet/pods/1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b/volumes" Apr 24 21:46:44.228720 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:44.228684 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 24 21:46:46.512478 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:46.512443 2567 generic.go:358] "Generic (PLEG): container finished" podID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerID="fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c" exitCode=0 Apr 24 21:46:46.512865 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:46.512510 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" event={"ID":"7dd3c20b-5f46-4349-ba0f-91f74891d4ba","Type":"ContainerDied","Data":"fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c"} Apr 24 21:46:47.516721 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:47.516688 2567 generic.go:358] "Generic (PLEG): container finished" podID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerID="6eb68d22717c32e0202f6883008afcff4d9678c7a8a1df844fb9e23c0b9ec40c" exitCode=0 Apr 24 21:46:47.517152 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:47.516763 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" 
event={"ID":"f9175222-2aa6-4fc1-90f3-9e45fe75afc3","Type":"ContainerDied","Data":"6eb68d22717c32e0202f6883008afcff4d9678c7a8a1df844fb9e23c0b9ec40c"} Apr 24 21:46:48.522083 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:48.522051 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" event={"ID":"f9175222-2aa6-4fc1-90f3-9e45fe75afc3","Type":"ContainerStarted","Data":"af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6"} Apr 24 21:46:48.522083 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:48.522085 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" event={"ID":"f9175222-2aa6-4fc1-90f3-9e45fe75afc3","Type":"ContainerStarted","Data":"a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e"} Apr 24 21:46:48.522483 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:48.522298 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:48.539745 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:48.539694 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podStartSLOduration=7.539678611 podStartE2EDuration="7.539678611s" podCreationTimestamp="2026-04-24 21:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:46:48.538934206 +0000 UTC m=+1093.080885819" watchObservedRunningTime="2026-04-24 21:46:48.539678611 +0000 UTC m=+1093.081630205" Apr 24 21:46:49.229154 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:49.229116 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 24 21:46:49.233472 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:49.233449 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:46:49.233868 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:49.233840 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:49.525353 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:49.525278 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:49.526586 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:49.526551 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:46:50.528722 
ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:50.528684 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:46:54.228621 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:54.228581 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 24 21:46:54.229012 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:54.228701 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:46:55.533304 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:55.533259 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:46:55.533892 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:55.533864 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:46:59.229097 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:59.229054 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 24 21:46:59.233401 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:59.233364 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:46:59.233784 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:46:59.233756 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:47:04.228908 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:04.228866 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 24 21:47:05.534637 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:05.534600 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.41:8080: connect: connection refused" Apr 24 21:47:09.228909 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:09.228873 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 24 21:47:09.232749 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:09.232725 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 21:47:09.232853 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:09.232836 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:47:09.233079 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:09.233055 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:47:09.233162 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:09.233151 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:47:12.074481 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.074459 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:47:12.184660 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.184588 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-proxy-tls\") pod \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " Apr 24 21:47:12.184793 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.184696 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-kserve-provision-location\") pod \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " Apr 24 21:47:12.184793 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.184741 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhhz2\" (UniqueName: \"kubernetes.io/projected/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-kube-api-access-lhhz2\") pod \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " Apr 24 21:47:12.184878 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.184805 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config\") pod \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\" (UID: \"7dd3c20b-5f46-4349-ba0f-91f74891d4ba\") " Apr 24 21:47:12.185117 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.185037 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7dd3c20b-5f46-4349-ba0f-91f74891d4ba" (UID: "7dd3c20b-5f46-4349-ba0f-91f74891d4ba"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:47:12.185215 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.185163 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config") pod "7dd3c20b-5f46-4349-ba0f-91f74891d4ba" (UID: "7dd3c20b-5f46-4349-ba0f-91f74891d4ba"). InnerVolumeSpecName "isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:47:12.186691 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.186668 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-kube-api-access-lhhz2" (OuterVolumeSpecName: "kube-api-access-lhhz2") pod "7dd3c20b-5f46-4349-ba0f-91f74891d4ba" (UID: "7dd3c20b-5f46-4349-ba0f-91f74891d4ba"). InnerVolumeSpecName "kube-api-access-lhhz2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:47:12.186794 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.186690 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7dd3c20b-5f46-4349-ba0f-91f74891d4ba" (UID: "7dd3c20b-5f46-4349-ba0f-91f74891d4ba"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:47:12.285853 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.285826 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:47:12.285853 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.285851 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-kserve-provision-location\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:47:12.286000 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.285866 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lhhz2\" (UniqueName: \"kubernetes.io/projected/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-kube-api-access-lhhz2\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:47:12.286000 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.285880 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7dd3c20b-5f46-4349-ba0f-91f74891d4ba-isvc-logger-raw-fb0e6-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:47:12.608275 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.608239 2567 generic.go:358] "Generic (PLEG): container finished" podID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerID="3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c" exitCode=0 Apr 24 21:47:12.608471 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.608331 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" event={"ID":"7dd3c20b-5f46-4349-ba0f-91f74891d4ba","Type":"ContainerDied","Data":"3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c"} Apr 24 21:47:12.608471 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.608374 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" event={"ID":"7dd3c20b-5f46-4349-ba0f-91f74891d4ba","Type":"ContainerDied","Data":"366162a9159172a1adabb419650b145c2abe1c27ae9d627c39b777a4ffa6f724"} Apr 24 21:47:12.608471 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.608372 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9" Apr 24 21:47:12.608471 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.608388 2567 scope.go:117] "RemoveContainer" containerID="3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c" Apr 24 21:47:12.616905 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.616893 2567 scope.go:117] "RemoveContainer" containerID="7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70" Apr 24 21:47:12.624117 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.624100 2567 scope.go:117] "RemoveContainer" containerID="fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c" Apr 24 21:47:12.631916 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.631891 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9"] Apr 24 21:47:12.632119 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.632106 2567 scope.go:117] "RemoveContainer" containerID="78809c1bf5d6a5805488d3b9878cc2f3c136ad8b5a8a6431b5417a2b19c940c9" Apr 24 21:47:12.636129 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.636110 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-fb0e6-predictor-77dc655476-ltfd9"] Apr 24 21:47:12.639105 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.639090 2567 scope.go:117] "RemoveContainer" containerID="3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c" Apr 24 21:47:12.639373 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:47:12.639351 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c\": container with ID starting with 3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c not found: ID does not exist" containerID="3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c" Apr 24 21:47:12.639429 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.639386 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c"} err="failed to get container status \"3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c\": rpc error: code = NotFound desc = could not find container \"3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c\": container with ID starting with 3b2bd4b63489b187845e2119517946431aa8a745fe597ecf66bc187909c82b8c not found: ID does not exist" Apr 24 21:47:12.639429 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.639404 2567 scope.go:117] "RemoveContainer" containerID="7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70" Apr 24 21:47:12.639625 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:47:12.639610 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70\": container with ID starting with 7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70 not found: ID does not exist" containerID="7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70" Apr 24 21:47:12.639683 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.639628 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70"} err="failed 
to get container status \"7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70\": rpc error: code = NotFound desc = could not find container \"7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70\": container with ID starting with 7f28d08588469385e9a2fc08ae38bbb7e18bdabca10e34e18f33d15f0ef58d70 not found: ID does not exist" Apr 24 21:47:12.639683 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.639641 2567 scope.go:117] "RemoveContainer" containerID="fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c" Apr 24 21:47:12.639823 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:47:12.639803 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c\": container with ID starting with fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c not found: ID does not exist" containerID="fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c" Apr 24 21:47:12.639886 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.639832 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c"} err="failed to get container status \"fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c\": rpc error: code = NotFound desc = could not find container \"fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c\": container with ID starting with fb1ac66d8dbbea2a7467efe59c0c612d23944d15cbb7eff81d6ffa14b11c190c not found: ID does not exist" Apr 24 21:47:12.639886 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.639856 2567 scope.go:117] "RemoveContainer" containerID="78809c1bf5d6a5805488d3b9878cc2f3c136ad8b5a8a6431b5417a2b19c940c9" Apr 24 21:47:12.640055 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:47:12.640036 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78809c1bf5d6a5805488d3b9878cc2f3c136ad8b5a8a6431b5417a2b19c940c9\": container with ID starting with 78809c1bf5d6a5805488d3b9878cc2f3c136ad8b5a8a6431b5417a2b19c940c9 not found: ID does not exist" containerID="78809c1bf5d6a5805488d3b9878cc2f3c136ad8b5a8a6431b5417a2b19c940c9" Apr 24 21:47:12.640095 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:12.640059 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78809c1bf5d6a5805488d3b9878cc2f3c136ad8b5a8a6431b5417a2b19c940c9"} err="failed to get container status \"78809c1bf5d6a5805488d3b9878cc2f3c136ad8b5a8a6431b5417a2b19c940c9\": rpc error: code = NotFound desc = could not find container \"78809c1bf5d6a5805488d3b9878cc2f3c136ad8b5a8a6431b5417a2b19c940c9\": container with ID starting with 78809c1bf5d6a5805488d3b9878cc2f3c136ad8b5a8a6431b5417a2b19c940c9 not found: ID does not exist" Apr 24 21:47:13.982286 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:13.982243 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" path="/var/lib/kubelet/pods/7dd3c20b-5f46-4349-ba0f-91f74891d4ba/volumes" Apr 24 21:47:15.534376 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:15.534340 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.41:8080: connect: connection refused" Apr 24 21:47:25.534395 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:25.534360 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:47:35.534384 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:35.534347 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:47:45.537934 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:45.537890 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:47:55.534363 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:47:55.534278 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:48:00.978502 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:48:00.978455 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:48:10.978656 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:48:10.978616 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:48:20.979214 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:48:20.979174 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:48:30.978817 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:48:30.978772 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:48:35.972925 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:48:35.972893 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:48:35.975403 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:48:35.975377 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:48:40.978659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:48:40.978624 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:48:50.979230 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:48:50.979187 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 21:49:00.979259 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:00.979232 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:49:02.116354 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.116319 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5"] Apr 24 21:49:02.116745 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.116716 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" containerID="cri-o://a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e" gracePeriod=30 Apr 24 21:49:02.116830 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.116800 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kube-rbac-proxy" containerID="cri-o://af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6" gracePeriod=30 Apr 24 21:49:02.222151 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222124 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74"] Apr 24 21:49:02.222713 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222694 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" Apr 24 21:49:02.222770 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222722 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" Apr 24 21:49:02.222770 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222756 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kube-rbac-proxy" Apr 24 21:49:02.222843 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222771 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kube-rbac-proxy" Apr 24 21:49:02.222843 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222781 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" Apr 24 21:49:02.222843 
ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222793 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" Apr 24 21:49:02.222843 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222809 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" containerName="kube-rbac-proxy" Apr 24 21:49:02.222843 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222820 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" containerName="kube-rbac-proxy" Apr 24 21:49:02.222994 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222842 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="storage-initializer" Apr 24 21:49:02.222994 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222853 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="storage-initializer" Apr 24 21:49:02.222994 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222866 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" containerName="kserve-container" Apr 24 21:49:02.222994 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222879 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" containerName="kserve-container" Apr 24 21:49:02.222994 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222965 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="agent" Apr 24 21:49:02.222994 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222979 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kube-rbac-proxy" Apr 24 21:49:02.223185 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.222997 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" containerName="kube-rbac-proxy" Apr 24 21:49:02.223185 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.223012 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7dd3c20b-5f46-4349-ba0f-91f74891d4ba" containerName="kserve-container" Apr 24 21:49:02.223185 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.223031 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="1cfcc287-0d5d-4ba7-be93-a76c1ad74d7b" containerName="kserve-container" Apr 24 21:49:02.226173 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.226151 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.228750 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.228730 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-cfc742-kube-rbac-proxy-sar-config\"" Apr 24 21:49:02.229215 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.228766 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-cfc742-predictor-serving-cert\"" Apr 24 21:49:02.230512 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.230477 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74"] Apr 24 21:49:02.315193 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.315164 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7425b4fb-1133-4845-89d5-240d7bb4dc45-kserve-provision-location\") pod \"isvc-primary-cfc742-predictor-6f9c45779c-m9v74\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.315193 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.315195 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxcdl\" (UniqueName: \"kubernetes.io/projected/7425b4fb-1133-4845-89d5-240d7bb4dc45-kube-api-access-hxcdl\") pod \"isvc-primary-cfc742-predictor-6f9c45779c-m9v74\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.315349 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.315217 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-cfc742-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7425b4fb-1133-4845-89d5-240d7bb4dc45-isvc-primary-cfc742-kube-rbac-proxy-sar-config\") pod \"isvc-primary-cfc742-predictor-6f9c45779c-m9v74\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.315349 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.315304 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7425b4fb-1133-4845-89d5-240d7bb4dc45-proxy-tls\") pod \"isvc-primary-cfc742-predictor-6f9c45779c-m9v74\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.416712 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.416650 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7425b4fb-1133-4845-89d5-240d7bb4dc45-kserve-provision-location\") pod \"isvc-primary-cfc742-predictor-6f9c45779c-m9v74\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.416712 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.416680 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxcdl\" (UniqueName: \"kubernetes.io/projected/7425b4fb-1133-4845-89d5-240d7bb4dc45-kube-api-access-hxcdl\") pod 
\"isvc-primary-cfc742-predictor-6f9c45779c-m9v74\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.416712 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.416700 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-cfc742-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7425b4fb-1133-4845-89d5-240d7bb4dc45-isvc-primary-cfc742-kube-rbac-proxy-sar-config\") pod \"isvc-primary-cfc742-predictor-6f9c45779c-m9v74\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.416970 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.416737 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7425b4fb-1133-4845-89d5-240d7bb4dc45-proxy-tls\") pod \"isvc-primary-cfc742-predictor-6f9c45779c-m9v74\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.417099 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.417079 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7425b4fb-1133-4845-89d5-240d7bb4dc45-kserve-provision-location\") pod \"isvc-primary-cfc742-predictor-6f9c45779c-m9v74\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.417437 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.417415 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-cfc742-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7425b4fb-1133-4845-89d5-240d7bb4dc45-isvc-primary-cfc742-kube-rbac-proxy-sar-config\") pod \"isvc-primary-cfc742-predictor-6f9c45779c-m9v74\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.419273 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.419254 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7425b4fb-1133-4845-89d5-240d7bb4dc45-proxy-tls\") pod \"isvc-primary-cfc742-predictor-6f9c45779c-m9v74\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.425750 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.425724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxcdl\" (UniqueName: \"kubernetes.io/projected/7425b4fb-1133-4845-89d5-240d7bb4dc45-kube-api-access-hxcdl\") pod \"isvc-primary-cfc742-predictor-6f9c45779c-m9v74\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.537464 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.537435 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:02.866077 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.865964 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74"] Apr 24 21:49:02.868295 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:49:02.868269 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7425b4fb_1133_4845_89d5_240d7bb4dc45.slice/crio-090b25f605cd7bb12efa2c3f0979357e83e7b09882f791366a4aadbcd31f909c WatchSource:0}: Error finding container 090b25f605cd7bb12efa2c3f0979357e83e7b09882f791366a4aadbcd31f909c: Status 404 returned error can't find the container with id 090b25f605cd7bb12efa2c3f0979357e83e7b09882f791366a4aadbcd31f909c Apr 24 21:49:02.969424 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.969384 2567 generic.go:358] "Generic (PLEG): container finished" podID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerID="af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6" exitCode=2 Apr 24 21:49:02.969588 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.969457 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" event={"ID":"f9175222-2aa6-4fc1-90f3-9e45fe75afc3","Type":"ContainerDied","Data":"af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6"} Apr 24 21:49:02.970857 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.970833 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" event={"ID":"7425b4fb-1133-4845-89d5-240d7bb4dc45","Type":"ContainerStarted","Data":"c6e4404f55b0087740174e0d21f75068796720c6163b8ce81719724f216175a8"} Apr 24 21:49:02.970966 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:02.970860 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" event={"ID":"7425b4fb-1133-4845-89d5-240d7bb4dc45","Type":"ContainerStarted","Data":"090b25f605cd7bb12efa2c3f0979357e83e7b09882f791366a4aadbcd31f909c"} Apr 24 21:49:05.528846 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:05.528806 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.41:8643/healthz\": dial tcp 10.132.0.41:8643: connect: connection refused" Apr 24 21:49:06.985869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:06.985834 2567 generic.go:358] "Generic (PLEG): container finished" podID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerID="c6e4404f55b0087740174e0d21f75068796720c6163b8ce81719724f216175a8" exitCode=0 Apr 24 21:49:06.986258 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:06.985904 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" event={"ID":"7425b4fb-1133-4845-89d5-240d7bb4dc45","Type":"ContainerDied","Data":"c6e4404f55b0087740174e0d21f75068796720c6163b8ce81719724f216175a8"} Apr 24 21:49:07.990640 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:07.990600 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" 
event={"ID":"7425b4fb-1133-4845-89d5-240d7bb4dc45","Type":"ContainerStarted","Data":"69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3"} Apr 24 21:49:07.991011 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:07.990646 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" event={"ID":"7425b4fb-1133-4845-89d5-240d7bb4dc45","Type":"ContainerStarted","Data":"71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550"} Apr 24 21:49:07.991011 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:07.990845 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:08.012435 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:08.012388 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" podStartSLOduration=6.012373376 podStartE2EDuration="6.012373376s" podCreationTimestamp="2026-04-24 21:49:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:49:08.010179979 +0000 UTC m=+1232.552131580" watchObservedRunningTime="2026-04-24 21:49:08.012373376 +0000 UTC m=+1232.554325005" Apr 24 21:49:08.994021 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:08.993911 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:09.004449 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:08.995114 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 21:49:09.996565 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:09.996512 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 21:49:10.459976 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.459954 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:49:10.480088 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.480062 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwnjn\" (UniqueName: \"kubernetes.io/projected/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-kube-api-access-qwnjn\") pod \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " Apr 24 21:49:10.480225 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.480204 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-proxy-tls\") pod \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " Apr 24 21:49:10.480298 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.480252 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-kserve-provision-location\") pod \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " Apr 24 21:49:10.480360 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.480298 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config\") pod \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\" (UID: \"f9175222-2aa6-4fc1-90f3-9e45fe75afc3\") " Apr 24 21:49:10.481045 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.480990 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f9175222-2aa6-4fc1-90f3-9e45fe75afc3" (UID: "f9175222-2aa6-4fc1-90f3-9e45fe75afc3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:10.481045 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.481007 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config") pod "f9175222-2aa6-4fc1-90f3-9e45fe75afc3" (UID: "f9175222-2aa6-4fc1-90f3-9e45fe75afc3"). InnerVolumeSpecName "isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:49:10.483057 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.483031 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f9175222-2aa6-4fc1-90f3-9e45fe75afc3" (UID: "f9175222-2aa6-4fc1-90f3-9e45fe75afc3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:49:10.483618 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.483589 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-kube-api-access-qwnjn" (OuterVolumeSpecName: "kube-api-access-qwnjn") pod "f9175222-2aa6-4fc1-90f3-9e45fe75afc3" (UID: "f9175222-2aa6-4fc1-90f3-9e45fe75afc3"). InnerVolumeSpecName "kube-api-access-qwnjn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:49:10.581888 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.581859 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:49:10.581888 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.581886 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-kserve-provision-location\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:49:10.582033 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.581898 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-isvc-sklearn-scale-raw-e8206-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:49:10.582033 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:10.581909 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwnjn\" (UniqueName: \"kubernetes.io/projected/f9175222-2aa6-4fc1-90f3-9e45fe75afc3-kube-api-access-qwnjn\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:49:11.001730 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.001694 2567 generic.go:358] "Generic (PLEG): container finished" podID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerID="a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e" exitCode=0 Apr 24 21:49:11.002120 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.001775 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" event={"ID":"f9175222-2aa6-4fc1-90f3-9e45fe75afc3","Type":"ContainerDied","Data":"a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e"} Apr 24 21:49:11.002120 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.001811 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" event={"ID":"f9175222-2aa6-4fc1-90f3-9e45fe75afc3","Type":"ContainerDied","Data":"37889006e2ea700b306f3ce38d1b50dc583f04fad3517226968efecab884d8e7"} Apr 24 21:49:11.002120 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.001826 2567 scope.go:117] "RemoveContainer" containerID="af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6" Apr 24 21:49:11.002120 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.001782 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5" Apr 24 21:49:11.011093 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.011072 2567 scope.go:117] "RemoveContainer" containerID="a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e" Apr 24 21:49:11.019090 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.019072 2567 scope.go:117] "RemoveContainer" containerID="6eb68d22717c32e0202f6883008afcff4d9678c7a8a1df844fb9e23c0b9ec40c" Apr 24 21:49:11.026725 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.026659 2567 scope.go:117] "RemoveContainer" containerID="af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6" Apr 24 21:49:11.027123 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:49:11.027097 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6\": container with ID starting with af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6 not found: ID does not exist" containerID="af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6" Apr 24 21:49:11.027234 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.027133 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6"} err="failed to get container status \"af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6\": rpc error: code = NotFound desc = could not find container \"af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6\": container with ID starting with af9620953c8079917dca33ddecb77a6ad2a4fe191ac6e97fe760a154a4568cb6 not found: ID does not exist" Apr 24 21:49:11.027234 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.027201 2567 scope.go:117] "RemoveContainer" containerID="a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e" Apr 24 21:49:11.027539 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:49:11.027497 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e\": container with ID starting with a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e not found: ID does not exist" containerID="a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e" Apr 24 21:49:11.027630 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.027553 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e"} err="failed to get container status \"a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e\": rpc error: code = NotFound desc = could not find container \"a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e\": container with ID starting with a60d07fa9056efa40ab2384d88630f12e87b1cc04fa2e162285a3f362ee98c2e not found: ID does not exist" Apr 24 21:49:11.027630 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.027575 2567 scope.go:117] "RemoveContainer" containerID="6eb68d22717c32e0202f6883008afcff4d9678c7a8a1df844fb9e23c0b9ec40c" Apr 24 21:49:11.027909 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:49:11.027888 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6eb68d22717c32e0202f6883008afcff4d9678c7a8a1df844fb9e23c0b9ec40c\": container with ID starting with 6eb68d22717c32e0202f6883008afcff4d9678c7a8a1df844fb9e23c0b9ec40c not found: ID does not exist" containerID="6eb68d22717c32e0202f6883008afcff4d9678c7a8a1df844fb9e23c0b9ec40c" Apr 24 21:49:11.027986 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.027917 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb68d22717c32e0202f6883008afcff4d9678c7a8a1df844fb9e23c0b9ec40c"} err="failed to get container status \"6eb68d22717c32e0202f6883008afcff4d9678c7a8a1df844fb9e23c0b9ec40c\": rpc error: code = NotFound desc = could not find container \"6eb68d22717c32e0202f6883008afcff4d9678c7a8a1df844fb9e23c0b9ec40c\": container with ID starting with 6eb68d22717c32e0202f6883008afcff4d9678c7a8a1df844fb9e23c0b9ec40c not found: ID does not exist" Apr 24 21:49:11.029718 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.029699 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5"] Apr 24 21:49:11.033304 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.033281 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-e8206-predictor-6b9744b894-fphp5"] Apr 24 21:49:11.982575 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:11.982545 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" path="/var/lib/kubelet/pods/f9175222-2aa6-4fc1-90f3-9e45fe75afc3/volumes" Apr 24 21:49:15.000986 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:15.000955 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:49:15.001428 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:15.001402 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 21:49:25.002105 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:25.002018 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 21:49:35.001654 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:35.001613 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 21:49:45.001552 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:45.001500 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 21:49:55.002179 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:49:55.002136 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" 
podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 21:50:05.002258 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:05.002222 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 21:50:15.002500 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:15.002473 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:50:22.323373 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.323330 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb"] Apr 24 21:50:22.323942 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.323847 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="storage-initializer" Apr 24 21:50:22.323942 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.323867 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="storage-initializer" Apr 24 21:50:22.323942 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.323897 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kube-rbac-proxy" Apr 24 21:50:22.323942 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.323907 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kube-rbac-proxy" Apr 24 21:50:22.323942 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.323926 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" Apr 24 21:50:22.323942 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.323934 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" Apr 24 21:50:22.324264 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.324020 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kserve-container" Apr 24 21:50:22.324264 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.324034 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9175222-2aa6-4fc1-90f3-9e45fe75afc3" containerName="kube-rbac-proxy" Apr 24 21:50:22.327607 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.327578 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.329725 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.329705 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-cfc742-kube-rbac-proxy-sar-config\"" Apr 24 21:50:22.329972 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.329954 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-cfc742-dockercfg-h76pt\"" Apr 24 21:50:22.330144 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.330124 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-cfc742\"" Apr 24 21:50:22.330202 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.330146 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 21:50:22.330202 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.330183 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-cfc742-predictor-serving-cert\"" Apr 24 21:50:22.331744 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.331724 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkx7r\" (UniqueName: \"kubernetes.io/projected/fbc34c10-26df-40e0-8b89-0615b9911212-kube-api-access-lkx7r\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.331869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.331759 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fbc34c10-26df-40e0-8b89-0615b9911212-cabundle-cert\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.331869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.331835 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbc34c10-26df-40e0-8b89-0615b9911212-proxy-tls\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.331869 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.331862 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-cfc742-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fbc34c10-26df-40e0-8b89-0615b9911212-isvc-secondary-cfc742-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.332032 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.331900 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbc34c10-26df-40e0-8b89-0615b9911212-kserve-provision-location\") pod 
\"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.338270 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.338248 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb"] Apr 24 21:50:22.432659 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.432636 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkx7r\" (UniqueName: \"kubernetes.io/projected/fbc34c10-26df-40e0-8b89-0615b9911212-kube-api-access-lkx7r\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.432791 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.432673 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fbc34c10-26df-40e0-8b89-0615b9911212-cabundle-cert\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.432791 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.432707 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbc34c10-26df-40e0-8b89-0615b9911212-proxy-tls\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.432791 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:50:22.432786 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-serving-cert: secret "isvc-secondary-cfc742-predictor-serving-cert" not found Apr 24 21:50:22.432986 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.432825 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-cfc742-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fbc34c10-26df-40e0-8b89-0615b9911212-isvc-secondary-cfc742-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.432986 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:50:22.432835 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbc34c10-26df-40e0-8b89-0615b9911212-proxy-tls podName:fbc34c10-26df-40e0-8b89-0615b9911212 nodeName:}" failed. No retries permitted until 2026-04-24 21:50:22.932822144 +0000 UTC m=+1307.474773720 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fbc34c10-26df-40e0-8b89-0615b9911212-proxy-tls") pod "isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" (UID: "fbc34c10-26df-40e0-8b89-0615b9911212") : secret "isvc-secondary-cfc742-predictor-serving-cert" not found Apr 24 21:50:22.432986 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.432875 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbc34c10-26df-40e0-8b89-0615b9911212-kserve-provision-location\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.433233 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.433210 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbc34c10-26df-40e0-8b89-0615b9911212-kserve-provision-location\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.433421 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.433400 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fbc34c10-26df-40e0-8b89-0615b9911212-cabundle-cert\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.433483 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.433427 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-cfc742-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fbc34c10-26df-40e0-8b89-0615b9911212-isvc-secondary-cfc742-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.441936 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.441914 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkx7r\" (UniqueName: \"kubernetes.io/projected/fbc34c10-26df-40e0-8b89-0615b9911212-kube-api-access-lkx7r\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.937774 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.937739 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbc34c10-26df-40e0-8b89-0615b9911212-proxy-tls\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:22.940182 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:22.940153 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbc34c10-26df-40e0-8b89-0615b9911212-proxy-tls\") pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " 
pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:23.238763 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:23.238678 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:23.358784 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:23.358758 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb"] Apr 24 21:50:23.361361 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:50:23.361329 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbc34c10_26df_40e0_8b89_0615b9911212.slice/crio-6c4c97f2c918ec2e9716520db6d8ee2518a02cae9ebb8c7d8c4599de820bea4c WatchSource:0}: Error finding container 6c4c97f2c918ec2e9716520db6d8ee2518a02cae9ebb8c7d8c4599de820bea4c: Status 404 returned error can't find the container with id 6c4c97f2c918ec2e9716520db6d8ee2518a02cae9ebb8c7d8c4599de820bea4c Apr 24 21:50:23.363241 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:23.363224 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:50:24.249981 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:24.249940 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" event={"ID":"fbc34c10-26df-40e0-8b89-0615b9911212","Type":"ContainerStarted","Data":"d1737931e2e9467ea8b10890dad69320ef54a89e3b3be8eda3b2b02fb8cd4559"} Apr 24 21:50:24.249981 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:24.249986 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" event={"ID":"fbc34c10-26df-40e0-8b89-0615b9911212","Type":"ContainerStarted","Data":"6c4c97f2c918ec2e9716520db6d8ee2518a02cae9ebb8c7d8c4599de820bea4c"} Apr 24 21:50:29.267872 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:29.267845 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cfc742-predictor-68cdb59558-vm6gb_fbc34c10-26df-40e0-8b89-0615b9911212/storage-initializer/0.log" Apr 24 21:50:29.268251 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:29.267886 2567 generic.go:358] "Generic (PLEG): container finished" podID="fbc34c10-26df-40e0-8b89-0615b9911212" containerID="d1737931e2e9467ea8b10890dad69320ef54a89e3b3be8eda3b2b02fb8cd4559" exitCode=1 Apr 24 21:50:29.268251 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:29.267960 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" event={"ID":"fbc34c10-26df-40e0-8b89-0615b9911212","Type":"ContainerDied","Data":"d1737931e2e9467ea8b10890dad69320ef54a89e3b3be8eda3b2b02fb8cd4559"} Apr 24 21:50:30.273130 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:30.273102 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cfc742-predictor-68cdb59558-vm6gb_fbc34c10-26df-40e0-8b89-0615b9911212/storage-initializer/0.log" Apr 24 21:50:30.273508 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:30.273199 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" event={"ID":"fbc34c10-26df-40e0-8b89-0615b9911212","Type":"ContainerStarted","Data":"8566aa95082940f6ecc049717f46cd3e6da1ff3af8216b511d22f432cb3e72e0"} Apr 24 
21:50:35.289975 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:35.289948 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cfc742-predictor-68cdb59558-vm6gb_fbc34c10-26df-40e0-8b89-0615b9911212/storage-initializer/1.log" Apr 24 21:50:35.290330 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:35.290314 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cfc742-predictor-68cdb59558-vm6gb_fbc34c10-26df-40e0-8b89-0615b9911212/storage-initializer/0.log" Apr 24 21:50:35.290373 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:35.290350 2567 generic.go:358] "Generic (PLEG): container finished" podID="fbc34c10-26df-40e0-8b89-0615b9911212" containerID="8566aa95082940f6ecc049717f46cd3e6da1ff3af8216b511d22f432cb3e72e0" exitCode=1 Apr 24 21:50:35.290424 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:35.290406 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" event={"ID":"fbc34c10-26df-40e0-8b89-0615b9911212","Type":"ContainerDied","Data":"8566aa95082940f6ecc049717f46cd3e6da1ff3af8216b511d22f432cb3e72e0"} Apr 24 21:50:35.290471 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:35.290448 2567 scope.go:117] "RemoveContainer" containerID="d1737931e2e9467ea8b10890dad69320ef54a89e3b3be8eda3b2b02fb8cd4559" Apr 24 21:50:35.290798 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:35.290776 2567 scope.go:117] "RemoveContainer" containerID="d1737931e2e9467ea8b10890dad69320ef54a89e3b3be8eda3b2b02fb8cd4559" Apr 24 21:50:35.301281 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:50:35.301255 2567 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-cfc742-predictor-68cdb59558-vm6gb_kserve-ci-e2e-test_fbc34c10-26df-40e0-8b89-0615b9911212_0 in pod sandbox 6c4c97f2c918ec2e9716520db6d8ee2518a02cae9ebb8c7d8c4599de820bea4c from index: no such id: 'd1737931e2e9467ea8b10890dad69320ef54a89e3b3be8eda3b2b02fb8cd4559'" containerID="d1737931e2e9467ea8b10890dad69320ef54a89e3b3be8eda3b2b02fb8cd4559" Apr 24 21:50:35.301352 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:50:35.301299 2567 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-cfc742-predictor-68cdb59558-vm6gb_kserve-ci-e2e-test_fbc34c10-26df-40e0-8b89-0615b9911212_0 in pod sandbox 6c4c97f2c918ec2e9716520db6d8ee2518a02cae9ebb8c7d8c4599de820bea4c from index: no such id: 'd1737931e2e9467ea8b10890dad69320ef54a89e3b3be8eda3b2b02fb8cd4559'; Skipping pod \"isvc-secondary-cfc742-predictor-68cdb59558-vm6gb_kserve-ci-e2e-test(fbc34c10-26df-40e0-8b89-0615b9911212)\"" logger="UnhandledError" Apr 24 21:50:35.302627 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:50:35.302606 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-cfc742-predictor-68cdb59558-vm6gb_kserve-ci-e2e-test(fbc34c10-26df-40e0-8b89-0615b9911212)\"" pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" podUID="fbc34c10-26df-40e0-8b89-0615b9911212" Apr 24 21:50:36.294142 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:36.294114 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cfc742-predictor-68cdb59558-vm6gb_fbc34c10-26df-40e0-8b89-0615b9911212/storage-initializer/1.log" Apr 24 21:50:40.387378 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.387348 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb"] Apr 24 21:50:40.444108 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.444074 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74"] Apr 24 21:50:40.444712 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.444620 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kserve-container" containerID="cri-o://71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550" gracePeriod=30 Apr 24 21:50:40.444712 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.444648 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kube-rbac-proxy" containerID="cri-o://69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3" gracePeriod=30 Apr 24 21:50:40.496952 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.496924 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d"] Apr 24 21:50:40.502079 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.502056 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.504197 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.504173 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-a7932c-predictor-serving-cert\"" Apr 24 21:50:40.504197 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.504181 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-a7932c-kube-rbac-proxy-sar-config\"" Apr 24 21:50:40.504400 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.504181 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-a7932c\"" Apr 24 21:50:40.504400 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.504273 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-a7932c-dockercfg-dcl94\"" Apr 24 21:50:40.509167 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.509147 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d"] Apr 24 21:50:40.565266 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.565249 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cfc742-predictor-68cdb59558-vm6gb_fbc34c10-26df-40e0-8b89-0615b9911212/storage-initializer/1.log" Apr 24 21:50:40.565364 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.565306 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:40.572979 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.572957 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-a7932c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-isvc-init-fail-a7932c-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.573071 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.572988 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w798\" (UniqueName: \"kubernetes.io/projected/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-kube-api-access-2w798\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.573071 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.573014 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-proxy-tls\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.573150 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.573134 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-cabundle-cert\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.573196 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.573161 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-kserve-provision-location\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.674482 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674414 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkx7r\" (UniqueName: \"kubernetes.io/projected/fbc34c10-26df-40e0-8b89-0615b9911212-kube-api-access-lkx7r\") pod \"fbc34c10-26df-40e0-8b89-0615b9911212\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " Apr 24 21:50:40.674482 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674461 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbc34c10-26df-40e0-8b89-0615b9911212-proxy-tls\") pod \"fbc34c10-26df-40e0-8b89-0615b9911212\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " Apr 24 21:50:40.674482 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674481 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/fbc34c10-26df-40e0-8b89-0615b9911212-cabundle-cert\") pod \"fbc34c10-26df-40e0-8b89-0615b9911212\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " Apr 24 21:50:40.674706 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674507 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-cfc742-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fbc34c10-26df-40e0-8b89-0615b9911212-isvc-secondary-cfc742-kube-rbac-proxy-sar-config\") pod \"fbc34c10-26df-40e0-8b89-0615b9911212\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " Apr 24 21:50:40.674706 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674588 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbc34c10-26df-40e0-8b89-0615b9911212-kserve-provision-location\") pod \"fbc34c10-26df-40e0-8b89-0615b9911212\" (UID: \"fbc34c10-26df-40e0-8b89-0615b9911212\") " Apr 24 21:50:40.674706 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674686 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-cabundle-cert\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.674872 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674715 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-kserve-provision-location\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.674872 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674752 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-a7932c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-isvc-init-fail-a7932c-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.674872 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674783 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w798\" (UniqueName: \"kubernetes.io/projected/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-kube-api-access-2w798\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.674872 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674826 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-proxy-tls\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.674872 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674851 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fbc34c10-26df-40e0-8b89-0615b9911212-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fbc34c10-26df-40e0-8b89-0615b9911212" (UID: "fbc34c10-26df-40e0-8b89-0615b9911212"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:40.674872 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674860 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc34c10-26df-40e0-8b89-0615b9911212-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "fbc34c10-26df-40e0-8b89-0615b9911212" (UID: "fbc34c10-26df-40e0-8b89-0615b9911212"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:40.675187 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674955 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbc34c10-26df-40e0-8b89-0615b9911212-isvc-secondary-cfc742-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-cfc742-kube-rbac-proxy-sar-config") pod "fbc34c10-26df-40e0-8b89-0615b9911212" (UID: "fbc34c10-26df-40e0-8b89-0615b9911212"). InnerVolumeSpecName "isvc-secondary-cfc742-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:40.675187 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674970 2567 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fbc34c10-26df-40e0-8b89-0615b9911212-cabundle-cert\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:40.675187 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.674992 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fbc34c10-26df-40e0-8b89-0615b9911212-kserve-provision-location\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:40.675390 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.675357 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-kserve-provision-location\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.675449 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.675415 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-cabundle-cert\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.675511 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.675494 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-a7932c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-isvc-init-fail-a7932c-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.676697 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.676667 2567 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbc34c10-26df-40e0-8b89-0615b9911212-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fbc34c10-26df-40e0-8b89-0615b9911212" (UID: "fbc34c10-26df-40e0-8b89-0615b9911212"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:40.676796 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.676772 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbc34c10-26df-40e0-8b89-0615b9911212-kube-api-access-lkx7r" (OuterVolumeSpecName: "kube-api-access-lkx7r") pod "fbc34c10-26df-40e0-8b89-0615b9911212" (UID: "fbc34c10-26df-40e0-8b89-0615b9911212"). InnerVolumeSpecName "kube-api-access-lkx7r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:40.677302 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.677284 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-proxy-tls\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.683700 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.683677 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w798\" (UniqueName: \"kubernetes.io/projected/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-kube-api-access-2w798\") pod \"isvc-init-fail-a7932c-predictor-648777d896-sq28d\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.775853 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.775825 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lkx7r\" (UniqueName: \"kubernetes.io/projected/fbc34c10-26df-40e0-8b89-0615b9911212-kube-api-access-lkx7r\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:40.775853 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.775849 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbc34c10-26df-40e0-8b89-0615b9911212-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:40.775978 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.775865 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-cfc742-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fbc34c10-26df-40e0-8b89-0615b9911212-isvc-secondary-cfc742-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:40.819641 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.819619 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:40.943679 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:40.943480 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d"] Apr 24 21:50:40.946196 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:50:40.946169 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9b64a1d_9f71_4364_9f9e_e76d7789ecf2.slice/crio-bb79b13b3c9dbe6ac45c732e413679402aa7e0e841afd4ab8c1b76a8cc48fd93 WatchSource:0}: Error finding container bb79b13b3c9dbe6ac45c732e413679402aa7e0e841afd4ab8c1b76a8cc48fd93: Status 404 returned error can't find the container with id bb79b13b3c9dbe6ac45c732e413679402aa7e0e841afd4ab8c1b76a8cc48fd93 Apr 24 21:50:41.310970 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:41.310894 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-cfc742-predictor-68cdb59558-vm6gb_fbc34c10-26df-40e0-8b89-0615b9911212/storage-initializer/1.log" Apr 24 21:50:41.311113 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:41.311006 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" Apr 24 21:50:41.311113 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:41.311005 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb" event={"ID":"fbc34c10-26df-40e0-8b89-0615b9911212","Type":"ContainerDied","Data":"6c4c97f2c918ec2e9716520db6d8ee2518a02cae9ebb8c7d8c4599de820bea4c"} Apr 24 21:50:41.311113 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:41.311048 2567 scope.go:117] "RemoveContainer" containerID="8566aa95082940f6ecc049717f46cd3e6da1ff3af8216b511d22f432cb3e72e0" Apr 24 21:50:41.312677 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:41.312634 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" event={"ID":"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2","Type":"ContainerStarted","Data":"cf29ccaed740bc276963e9dfe7a65e50212e8ab60e587185b13cfc12d41e0d11"} Apr 24 21:50:41.312810 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:41.312681 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" event={"ID":"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2","Type":"ContainerStarted","Data":"bb79b13b3c9dbe6ac45c732e413679402aa7e0e841afd4ab8c1b76a8cc48fd93"} Apr 24 21:50:41.314929 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:41.314895 2567 generic.go:358] "Generic (PLEG): container finished" podID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerID="69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3" exitCode=2 Apr 24 21:50:41.315031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:41.314942 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" event={"ID":"7425b4fb-1133-4845-89d5-240d7bb4dc45","Type":"ContainerDied","Data":"69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3"} Apr 24 21:50:41.363018 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:41.362989 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb"] Apr 24 21:50:41.366350 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:50:41.366325 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-cfc742-predictor-68cdb59558-vm6gb"] Apr 24 21:50:41.982867 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:41.982832 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbc34c10-26df-40e0-8b89-0615b9911212" path="/var/lib/kubelet/pods/fbc34c10-26df-40e0-8b89-0615b9911212/volumes" Apr 24 21:50:44.287718 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.287695 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:50:44.327473 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.327441 2567 generic.go:358] "Generic (PLEG): container finished" podID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerID="71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550" exitCode=0 Apr 24 21:50:44.327625 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.327537 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" Apr 24 21:50:44.327625 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.327536 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" event={"ID":"7425b4fb-1133-4845-89d5-240d7bb4dc45","Type":"ContainerDied","Data":"71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550"} Apr 24 21:50:44.327751 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.327643 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74" event={"ID":"7425b4fb-1133-4845-89d5-240d7bb4dc45","Type":"ContainerDied","Data":"090b25f605cd7bb12efa2c3f0979357e83e7b09882f791366a4aadbcd31f909c"} Apr 24 21:50:44.327751 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.327660 2567 scope.go:117] "RemoveContainer" containerID="69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3" Apr 24 21:50:44.335193 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.335180 2567 scope.go:117] "RemoveContainer" containerID="71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550" Apr 24 21:50:44.342183 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.342168 2567 scope.go:117] "RemoveContainer" containerID="c6e4404f55b0087740174e0d21f75068796720c6163b8ce81719724f216175a8" Apr 24 21:50:44.348633 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.348618 2567 scope.go:117] "RemoveContainer" containerID="69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3" Apr 24 21:50:44.348871 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:50:44.348854 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3\": container with ID starting with 69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3 not found: ID does not exist" containerID="69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3" Apr 24 21:50:44.348917 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.348879 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3"} err="failed to get container status \"69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3\": rpc error: code = 
NotFound desc = could not find container \"69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3\": container with ID starting with 69323a4760b6c5340ee0f103be5d69b7c7cba27240d592e76ee1da7926c3bed3 not found: ID does not exist" Apr 24 21:50:44.348917 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.348896 2567 scope.go:117] "RemoveContainer" containerID="71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550" Apr 24 21:50:44.349112 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:50:44.349093 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550\": container with ID starting with 71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550 not found: ID does not exist" containerID="71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550" Apr 24 21:50:44.349179 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.349120 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550"} err="failed to get container status \"71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550\": rpc error: code = NotFound desc = could not find container \"71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550\": container with ID starting with 71a58aaadb3f4df1936573c0985285f722b9343f713d81304d3827567a82a550 not found: ID does not exist" Apr 24 21:50:44.349179 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.349146 2567 scope.go:117] "RemoveContainer" containerID="c6e4404f55b0087740174e0d21f75068796720c6163b8ce81719724f216175a8" Apr 24 21:50:44.349356 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:50:44.349342 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6e4404f55b0087740174e0d21f75068796720c6163b8ce81719724f216175a8\": container with ID starting with c6e4404f55b0087740174e0d21f75068796720c6163b8ce81719724f216175a8 not found: ID does not exist" containerID="c6e4404f55b0087740174e0d21f75068796720c6163b8ce81719724f216175a8" Apr 24 21:50:44.349402 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.349361 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e4404f55b0087740174e0d21f75068796720c6163b8ce81719724f216175a8"} err="failed to get container status \"c6e4404f55b0087740174e0d21f75068796720c6163b8ce81719724f216175a8\": rpc error: code = NotFound desc = could not find container \"c6e4404f55b0087740174e0d21f75068796720c6163b8ce81719724f216175a8\": container with ID starting with c6e4404f55b0087740174e0d21f75068796720c6163b8ce81719724f216175a8 not found: ID does not exist" Apr 24 21:50:44.404855 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.404799 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-cfc742-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7425b4fb-1133-4845-89d5-240d7bb4dc45-isvc-primary-cfc742-kube-rbac-proxy-sar-config\") pod \"7425b4fb-1133-4845-89d5-240d7bb4dc45\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " Apr 24 21:50:44.404947 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.404866 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxcdl\" (UniqueName: \"kubernetes.io/projected/7425b4fb-1133-4845-89d5-240d7bb4dc45-kube-api-access-hxcdl\") 
pod \"7425b4fb-1133-4845-89d5-240d7bb4dc45\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " Apr 24 21:50:44.404947 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.404905 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7425b4fb-1133-4845-89d5-240d7bb4dc45-kserve-provision-location\") pod \"7425b4fb-1133-4845-89d5-240d7bb4dc45\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " Apr 24 21:50:44.404947 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.404931 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7425b4fb-1133-4845-89d5-240d7bb4dc45-proxy-tls\") pod \"7425b4fb-1133-4845-89d5-240d7bb4dc45\" (UID: \"7425b4fb-1133-4845-89d5-240d7bb4dc45\") " Apr 24 21:50:44.405189 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.405163 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7425b4fb-1133-4845-89d5-240d7bb4dc45-isvc-primary-cfc742-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-cfc742-kube-rbac-proxy-sar-config") pod "7425b4fb-1133-4845-89d5-240d7bb4dc45" (UID: "7425b4fb-1133-4845-89d5-240d7bb4dc45"). InnerVolumeSpecName "isvc-primary-cfc742-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:44.405235 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.405174 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7425b4fb-1133-4845-89d5-240d7bb4dc45-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7425b4fb-1133-4845-89d5-240d7bb4dc45" (UID: "7425b4fb-1133-4845-89d5-240d7bb4dc45"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:44.406865 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.406842 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7425b4fb-1133-4845-89d5-240d7bb4dc45-kube-api-access-hxcdl" (OuterVolumeSpecName: "kube-api-access-hxcdl") pod "7425b4fb-1133-4845-89d5-240d7bb4dc45" (UID: "7425b4fb-1133-4845-89d5-240d7bb4dc45"). InnerVolumeSpecName "kube-api-access-hxcdl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:44.406947 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.406852 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7425b4fb-1133-4845-89d5-240d7bb4dc45-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7425b4fb-1133-4845-89d5-240d7bb4dc45" (UID: "7425b4fb-1133-4845-89d5-240d7bb4dc45"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:44.506511 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.506484 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-cfc742-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7425b4fb-1133-4845-89d5-240d7bb4dc45-isvc-primary-cfc742-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:44.506511 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.506508 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hxcdl\" (UniqueName: \"kubernetes.io/projected/7425b4fb-1133-4845-89d5-240d7bb4dc45-kube-api-access-hxcdl\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:44.506658 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.506518 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7425b4fb-1133-4845-89d5-240d7bb4dc45-kserve-provision-location\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:44.506658 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.506553 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7425b4fb-1133-4845-89d5-240d7bb4dc45-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:44.652592 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.652564 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74"] Apr 24 21:50:44.658238 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:44.658184 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-cfc742-predictor-6f9c45779c-m9v74"] Apr 24 21:50:45.982803 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:45.982724 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" path="/var/lib/kubelet/pods/7425b4fb-1133-4845-89d5-240d7bb4dc45/volumes" Apr 24 21:50:46.335685 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:46.335608 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a7932c-predictor-648777d896-sq28d_c9b64a1d-9f71-4364-9f9e-e76d7789ecf2/storage-initializer/0.log" Apr 24 21:50:46.335685 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:46.335648 2567 generic.go:358] "Generic (PLEG): container finished" podID="c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" containerID="cf29ccaed740bc276963e9dfe7a65e50212e8ab60e587185b13cfc12d41e0d11" exitCode=1 Apr 24 21:50:46.335903 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:46.335692 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" event={"ID":"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2","Type":"ContainerDied","Data":"cf29ccaed740bc276963e9dfe7a65e50212e8ab60e587185b13cfc12d41e0d11"} Apr 24 21:50:47.341848 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:47.341821 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a7932c-predictor-648777d896-sq28d_c9b64a1d-9f71-4364-9f9e-e76d7789ecf2/storage-initializer/0.log" Apr 24 21:50:47.342304 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:47.341907 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" 
event={"ID":"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2","Type":"ContainerStarted","Data":"19fcc36a104e51ccc8da6b6b78a10972c060ef10cc0a9b9fb4d88e92c44f560b"} Apr 24 21:50:49.350419 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:49.350396 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a7932c-predictor-648777d896-sq28d_c9b64a1d-9f71-4364-9f9e-e76d7789ecf2/storage-initializer/1.log" Apr 24 21:50:49.350761 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:49.350744 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a7932c-predictor-648777d896-sq28d_c9b64a1d-9f71-4364-9f9e-e76d7789ecf2/storage-initializer/0.log" Apr 24 21:50:49.350812 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:49.350779 2567 generic.go:358] "Generic (PLEG): container finished" podID="c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" containerID="19fcc36a104e51ccc8da6b6b78a10972c060ef10cc0a9b9fb4d88e92c44f560b" exitCode=1 Apr 24 21:50:49.350886 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:49.350864 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" event={"ID":"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2","Type":"ContainerDied","Data":"19fcc36a104e51ccc8da6b6b78a10972c060ef10cc0a9b9fb4d88e92c44f560b"} Apr 24 21:50:49.350945 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:49.350918 2567 scope.go:117] "RemoveContainer" containerID="cf29ccaed740bc276963e9dfe7a65e50212e8ab60e587185b13cfc12d41e0d11" Apr 24 21:50:49.351354 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:49.351331 2567 scope.go:117] "RemoveContainer" containerID="cf29ccaed740bc276963e9dfe7a65e50212e8ab60e587185b13cfc12d41e0d11" Apr 24 21:50:49.378119 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:50:49.378087 2567 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-a7932c-predictor-648777d896-sq28d_kserve-ci-e2e-test_c9b64a1d-9f71-4364-9f9e-e76d7789ecf2_0 in pod sandbox bb79b13b3c9dbe6ac45c732e413679402aa7e0e841afd4ab8c1b76a8cc48fd93 from index: no such id: 'cf29ccaed740bc276963e9dfe7a65e50212e8ab60e587185b13cfc12d41e0d11'" containerID="cf29ccaed740bc276963e9dfe7a65e50212e8ab60e587185b13cfc12d41e0d11" Apr 24 21:50:49.378194 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:49.378126 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf29ccaed740bc276963e9dfe7a65e50212e8ab60e587185b13cfc12d41e0d11"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-a7932c-predictor-648777d896-sq28d_kserve-ci-e2e-test_c9b64a1d-9f71-4364-9f9e-e76d7789ecf2_0 in pod sandbox bb79b13b3c9dbe6ac45c732e413679402aa7e0e841afd4ab8c1b76a8cc48fd93 from index: no such id: 'cf29ccaed740bc276963e9dfe7a65e50212e8ab60e587185b13cfc12d41e0d11'" Apr 24 21:50:49.378317 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:50:49.378299 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-a7932c-predictor-648777d896-sq28d_kserve-ci-e2e-test(c9b64a1d-9f71-4364-9f9e-e76d7789ecf2)\"" pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" podUID="c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" Apr 24 21:50:50.355162 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.355134 2567 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a7932c-predictor-648777d896-sq28d_c9b64a1d-9f71-4364-9f9e-e76d7789ecf2/storage-initializer/1.log" Apr 24 21:50:50.508004 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.507959 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d"] Apr 24 21:50:50.645720 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.645697 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a7932c-predictor-648777d896-sq28d_c9b64a1d-9f71-4364-9f9e-e76d7789ecf2/storage-initializer/1.log" Apr 24 21:50:50.645830 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.645765 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:50.648669 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.648643 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf"] Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649077 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kserve-container" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649107 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kserve-container" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649122 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kube-rbac-proxy" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649130 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kube-rbac-proxy" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649141 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649150 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649162 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649170 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649185 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbc34c10-26df-40e0-8b89-0615b9911212" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649193 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc34c10-26df-40e0-8b89-0615b9911212" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649207 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649215 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649238 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbc34c10-26df-40e0-8b89-0615b9911212" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649249 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbc34c10-26df-40e0-8b89-0615b9911212" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649344 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649362 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kserve-container" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649377 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649400 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbc34c10-26df-40e0-8b89-0615b9911212" containerName="storage-initializer" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649412 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="7425b4fb-1133-4845-89d5-240d7bb4dc45" containerName="kube-rbac-proxy" Apr 24 21:50:50.649629 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.649420 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbc34c10-26df-40e0-8b89-0615b9911212" containerName="storage-initializer" Apr 24 21:50:50.654985 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.654966 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.657001 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.656979 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-1ac54-predictor-serving-cert\"" Apr 24 21:50:50.657108 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.656981 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-1ac54-kube-rbac-proxy-sar-config\"" Apr 24 21:50:50.657108 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.657053 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-mq7qd\"" Apr 24 21:50:50.660344 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.660321 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf"] Apr 24 21:50:50.751000 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.750977 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-kserve-provision-location\") pod \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " Apr 24 21:50:50.751114 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751008 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-proxy-tls\") pod \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " Apr 24 21:50:50.751114 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751044 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-cabundle-cert\") pod \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " Apr 24 21:50:50.751114 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751108 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w798\" (UniqueName: \"kubernetes.io/projected/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-kube-api-access-2w798\") pod \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " Apr 24 21:50:50.751234 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751146 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-a7932c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-isvc-init-fail-a7932c-kube-rbac-proxy-sar-config\") pod \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\" (UID: \"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2\") " Apr 24 21:50:50.751299 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751273 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-1ac54-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b033b505-1678-4e29-bb5b-ae293c77330c-raw-sklearn-1ac54-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-1ac54-predictor-995c89dc6-45xgf\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.751299 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751286 2567 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" (UID: "c9b64a1d-9f71-4364-9f9e-e76d7789ecf2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:50.751417 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751381 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b033b505-1678-4e29-bb5b-ae293c77330c-proxy-tls\") pod \"raw-sklearn-1ac54-predictor-995c89dc6-45xgf\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.751489 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751469 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" (UID: "c9b64a1d-9f71-4364-9f9e-e76d7789ecf2"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:50.751580 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751483 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmxgb\" (UniqueName: \"kubernetes.io/projected/b033b505-1678-4e29-bb5b-ae293c77330c-kube-api-access-nmxgb\") pod \"raw-sklearn-1ac54-predictor-995c89dc6-45xgf\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.751580 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751567 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b033b505-1678-4e29-bb5b-ae293c77330c-kserve-provision-location\") pod \"raw-sklearn-1ac54-predictor-995c89dc6-45xgf\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.751690 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751611 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-kserve-provision-location\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:50.751690 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751626 2567 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-cabundle-cert\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:50.751690 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.751638 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-isvc-init-fail-a7932c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-a7932c-kube-rbac-proxy-sar-config") pod "c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" (UID: "c9b64a1d-9f71-4364-9f9e-e76d7789ecf2"). InnerVolumeSpecName "isvc-init-fail-a7932c-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:50:50.753102 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.753073 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-kube-api-access-2w798" (OuterVolumeSpecName: "kube-api-access-2w798") pod "c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" (UID: "c9b64a1d-9f71-4364-9f9e-e76d7789ecf2"). InnerVolumeSpecName "kube-api-access-2w798". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:50.753204 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.753111 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" (UID: "c9b64a1d-9f71-4364-9f9e-e76d7789ecf2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:50.851964 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.851934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b033b505-1678-4e29-bb5b-ae293c77330c-kserve-provision-location\") pod \"raw-sklearn-1ac54-predictor-995c89dc6-45xgf\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.852107 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.851985 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-1ac54-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b033b505-1678-4e29-bb5b-ae293c77330c-raw-sklearn-1ac54-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-1ac54-predictor-995c89dc6-45xgf\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.852107 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.852032 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b033b505-1678-4e29-bb5b-ae293c77330c-proxy-tls\") pod \"raw-sklearn-1ac54-predictor-995c89dc6-45xgf\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.852107 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.852102 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmxgb\" (UniqueName: \"kubernetes.io/projected/b033b505-1678-4e29-bb5b-ae293c77330c-kube-api-access-nmxgb\") pod \"raw-sklearn-1ac54-predictor-995c89dc6-45xgf\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.852289 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.852139 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2w798\" (UniqueName: \"kubernetes.io/projected/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-kube-api-access-2w798\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:50.852289 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.852158 2567 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-a7932c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-isvc-init-fail-a7932c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" 
Apr 24 21:50:50.852289 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.852175 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:50:50.852422 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.852321 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b033b505-1678-4e29-bb5b-ae293c77330c-kserve-provision-location\") pod \"raw-sklearn-1ac54-predictor-995c89dc6-45xgf\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.852752 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.852727 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-1ac54-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b033b505-1678-4e29-bb5b-ae293c77330c-raw-sklearn-1ac54-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-1ac54-predictor-995c89dc6-45xgf\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.854473 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.854452 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b033b505-1678-4e29-bb5b-ae293c77330c-proxy-tls\") pod \"raw-sklearn-1ac54-predictor-995c89dc6-45xgf\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.860580 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.860551 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmxgb\" (UniqueName: \"kubernetes.io/projected/b033b505-1678-4e29-bb5b-ae293c77330c-kube-api-access-nmxgb\") pod \"raw-sklearn-1ac54-predictor-995c89dc6-45xgf\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:50.967674 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:50.967614 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:51.290367 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:51.290344 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf"] Apr 24 21:50:51.292087 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:50:51.292062 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb033b505_1678_4e29_bb5b_ae293c77330c.slice/crio-6481464f0035d7e06b90eddc72fa11fa384c631c79266fa132b25f524c9d18e7 WatchSource:0}: Error finding container 6481464f0035d7e06b90eddc72fa11fa384c631c79266fa132b25f524c9d18e7: Status 404 returned error can't find the container with id 6481464f0035d7e06b90eddc72fa11fa384c631c79266fa132b25f524c9d18e7 Apr 24 21:50:51.360283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:51.360261 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-a7932c-predictor-648777d896-sq28d_c9b64a1d-9f71-4364-9f9e-e76d7789ecf2/storage-initializer/1.log" Apr 24 21:50:51.360610 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:51.360382 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" event={"ID":"c9b64a1d-9f71-4364-9f9e-e76d7789ecf2","Type":"ContainerDied","Data":"bb79b13b3c9dbe6ac45c732e413679402aa7e0e841afd4ab8c1b76a8cc48fd93"} Apr 24 21:50:51.360610 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:51.360401 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d" Apr 24 21:50:51.360610 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:51.360422 2567 scope.go:117] "RemoveContainer" containerID="19fcc36a104e51ccc8da6b6b78a10972c060ef10cc0a9b9fb4d88e92c44f560b" Apr 24 21:50:51.361931 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:51.361910 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" event={"ID":"b033b505-1678-4e29-bb5b-ae293c77330c","Type":"ContainerStarted","Data":"e9f33680adbc3b8bfb3a970685316067c265ac5a62c0dac41094696a0b7328e2"} Apr 24 21:50:51.362028 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:51.361937 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" event={"ID":"b033b505-1678-4e29-bb5b-ae293c77330c","Type":"ContainerStarted","Data":"6481464f0035d7e06b90eddc72fa11fa384c631c79266fa132b25f524c9d18e7"} Apr 24 21:50:51.414348 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:51.414322 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d"] Apr 24 21:50:51.427111 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:51.427078 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-a7932c-predictor-648777d896-sq28d"] Apr 24 21:50:51.982641 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:51.982604 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b64a1d-9f71-4364-9f9e-e76d7789ecf2" path="/var/lib/kubelet/pods/c9b64a1d-9f71-4364-9f9e-e76d7789ecf2/volumes" Apr 24 21:50:55.376792 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:55.376758 2567 generic.go:358] "Generic (PLEG): container finished" podID="b033b505-1678-4e29-bb5b-ae293c77330c" 
containerID="e9f33680adbc3b8bfb3a970685316067c265ac5a62c0dac41094696a0b7328e2" exitCode=0 Apr 24 21:50:55.377208 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:55.376831 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" event={"ID":"b033b505-1678-4e29-bb5b-ae293c77330c","Type":"ContainerDied","Data":"e9f33680adbc3b8bfb3a970685316067c265ac5a62c0dac41094696a0b7328e2"} Apr 24 21:50:56.382062 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:56.382034 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" event={"ID":"b033b505-1678-4e29-bb5b-ae293c77330c","Type":"ContainerStarted","Data":"a82b719cc651db850d77936995999644ab5e7862e4960800fdff128d1ae11921"} Apr 24 21:50:56.382403 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:56.382068 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" event={"ID":"b033b505-1678-4e29-bb5b-ae293c77330c","Type":"ContainerStarted","Data":"ea8c56f71e50166e925eacff9b26ebbf3460372f84146f2fe0e2fb98ad4adfc2"} Apr 24 21:50:56.382403 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:56.382259 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:56.403917 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:56.403864 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podStartSLOduration=6.403850201 podStartE2EDuration="6.403850201s" podCreationTimestamp="2026-04-24 21:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:50:56.402384716 +0000 UTC m=+1340.944336315" watchObservedRunningTime="2026-04-24 21:50:56.403850201 +0000 UTC m=+1340.945801799" Apr 24 21:50:57.385132 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:57.385099 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:50:57.386398 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:57.386369 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 21:50:58.388082 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:50:58.388029 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 21:51:03.392460 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:51:03.392432 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:51:03.393048 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:51:03.393016 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: 
connect: connection refused" Apr 24 21:51:13.393231 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:51:13.393189 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 21:51:23.393427 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:51:23.393381 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 21:51:33.393172 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:51:33.393134 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 21:51:43.393042 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:51:43.392997 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 21:51:53.393360 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:51:53.393319 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 21:52:03.394404 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:03.394375 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:52:10.794393 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:10.794358 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf"] Apr 24 21:52:10.794980 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:10.794778 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" containerID="cri-o://ea8c56f71e50166e925eacff9b26ebbf3460372f84146f2fe0e2fb98ad4adfc2" gracePeriod=30 Apr 24 21:52:10.794980 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:10.794830 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kube-rbac-proxy" containerID="cri-o://a82b719cc651db850d77936995999644ab5e7862e4960800fdff128d1ae11921" gracePeriod=30 Apr 24 21:52:10.938263 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:10.938231 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7"] Apr 24 21:52:10.941831 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:10.941811 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:10.943986 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:10.943957 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-1f653-predictor-serving-cert\"" Apr 24 21:52:10.943986 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:10.943970 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config\"" Apr 24 21:52:10.950637 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:10.950614 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7"] Apr 24 21:52:11.060641 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.060565 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88wjm\" (UniqueName: \"kubernetes.io/projected/f59d6c94-3929-4f95-b202-40c968235482-kube-api-access-88wjm\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.060802 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.060651 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f59d6c94-3929-4f95-b202-40c968235482-raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.060802 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.060699 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f59d6c94-3929-4f95-b202-40c968235482-proxy-tls\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.060955 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.060933 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f59d6c94-3929-4f95-b202-40c968235482-kserve-provision-location\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.161569 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.161517 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f59d6c94-3929-4f95-b202-40c968235482-raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.161721 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.161578 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/f59d6c94-3929-4f95-b202-40c968235482-proxy-tls\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.161721 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.161628 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f59d6c94-3929-4f95-b202-40c968235482-kserve-provision-location\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.161721 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.161648 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88wjm\" (UniqueName: \"kubernetes.io/projected/f59d6c94-3929-4f95-b202-40c968235482-kube-api-access-88wjm\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.161908 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:52:11.161739 2567 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-serving-cert: secret "raw-sklearn-runtime-1f653-predictor-serving-cert" not found Apr 24 21:52:11.161908 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:52:11.161839 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f59d6c94-3929-4f95-b202-40c968235482-proxy-tls podName:f59d6c94-3929-4f95-b202-40c968235482 nodeName:}" failed. No retries permitted until 2026-04-24 21:52:11.661817209 +0000 UTC m=+1416.203768800 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f59d6c94-3929-4f95-b202-40c968235482-proxy-tls") pod "raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" (UID: "f59d6c94-3929-4f95-b202-40c968235482") : secret "raw-sklearn-runtime-1f653-predictor-serving-cert" not found Apr 24 21:52:11.162054 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.162004 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f59d6c94-3929-4f95-b202-40c968235482-kserve-provision-location\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.162195 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.162177 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f59d6c94-3929-4f95-b202-40c968235482-raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.171910 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.171882 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88wjm\" (UniqueName: \"kubernetes.io/projected/f59d6c94-3929-4f95-b202-40c968235482-kube-api-access-88wjm\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.642006 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.641968 2567 generic.go:358] "Generic (PLEG): container finished" podID="b033b505-1678-4e29-bb5b-ae293c77330c" containerID="a82b719cc651db850d77936995999644ab5e7862e4960800fdff128d1ae11921" exitCode=2 Apr 24 21:52:11.642167 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.642043 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" event={"ID":"b033b505-1678-4e29-bb5b-ae293c77330c","Type":"ContainerDied","Data":"a82b719cc651db850d77936995999644ab5e7862e4960800fdff128d1ae11921"} Apr 24 21:52:11.665330 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.665306 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f59d6c94-3929-4f95-b202-40c968235482-proxy-tls\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.667666 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.667647 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f59d6c94-3929-4f95-b202-40c968235482-proxy-tls\") pod \"raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.852877 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.852846 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:11.974463 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:11.974439 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7"] Apr 24 21:52:11.976771 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:52:11.976733 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf59d6c94_3929_4f95_b202_40c968235482.slice/crio-aec29aabaa811ecccff6dc8cdeef409dd2258f0e52daab93c3bc4f732c6a7cb6 WatchSource:0}: Error finding container aec29aabaa811ecccff6dc8cdeef409dd2258f0e52daab93c3bc4f732c6a7cb6: Status 404 returned error can't find the container with id aec29aabaa811ecccff6dc8cdeef409dd2258f0e52daab93c3bc4f732c6a7cb6 Apr 24 21:52:12.646932 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:12.646891 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" event={"ID":"f59d6c94-3929-4f95-b202-40c968235482","Type":"ContainerStarted","Data":"e7f35b79b66047572ee27fd9d6c1f9f76af2618fc43c94efc40fcba5784800a0"} Apr 24 21:52:12.646932 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:12.646927 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" event={"ID":"f59d6c94-3929-4f95-b202-40c968235482","Type":"ContainerStarted","Data":"aec29aabaa811ecccff6dc8cdeef409dd2258f0e52daab93c3bc4f732c6a7cb6"} Apr 24 21:52:13.388711 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:13.388666 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.45:8643/healthz\": dial tcp 10.132.0.45:8643: connect: connection refused" Apr 24 21:52:13.392953 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:13.392921 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused" Apr 24 21:52:14.656110 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.656079 2567 generic.go:358] "Generic (PLEG): container finished" podID="b033b505-1678-4e29-bb5b-ae293c77330c" containerID="ea8c56f71e50166e925eacff9b26ebbf3460372f84146f2fe0e2fb98ad4adfc2" exitCode=0 Apr 24 21:52:14.656444 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.656159 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" event={"ID":"b033b505-1678-4e29-bb5b-ae293c77330c","Type":"ContainerDied","Data":"ea8c56f71e50166e925eacff9b26ebbf3460372f84146f2fe0e2fb98ad4adfc2"} Apr 24 21:52:14.732954 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.732930 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:52:14.791004 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.790942 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b033b505-1678-4e29-bb5b-ae293c77330c-proxy-tls\") pod \"b033b505-1678-4e29-bb5b-ae293c77330c\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " Apr 24 21:52:14.791004 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.790970 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmxgb\" (UniqueName: \"kubernetes.io/projected/b033b505-1678-4e29-bb5b-ae293c77330c-kube-api-access-nmxgb\") pod \"b033b505-1678-4e29-bb5b-ae293c77330c\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " Apr 24 21:52:14.791175 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.791029 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b033b505-1678-4e29-bb5b-ae293c77330c-kserve-provision-location\") pod \"b033b505-1678-4e29-bb5b-ae293c77330c\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " Apr 24 21:52:14.791175 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.791070 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-1ac54-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b033b505-1678-4e29-bb5b-ae293c77330c-raw-sklearn-1ac54-kube-rbac-proxy-sar-config\") pod \"b033b505-1678-4e29-bb5b-ae293c77330c\" (UID: \"b033b505-1678-4e29-bb5b-ae293c77330c\") " Apr 24 21:52:14.791397 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.791378 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b033b505-1678-4e29-bb5b-ae293c77330c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b033b505-1678-4e29-bb5b-ae293c77330c" (UID: "b033b505-1678-4e29-bb5b-ae293c77330c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:52:14.791468 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.791423 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b033b505-1678-4e29-bb5b-ae293c77330c-raw-sklearn-1ac54-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-1ac54-kube-rbac-proxy-sar-config") pod "b033b505-1678-4e29-bb5b-ae293c77330c" (UID: "b033b505-1678-4e29-bb5b-ae293c77330c"). InnerVolumeSpecName "raw-sklearn-1ac54-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:52:14.792899 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.792877 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b033b505-1678-4e29-bb5b-ae293c77330c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b033b505-1678-4e29-bb5b-ae293c77330c" (UID: "b033b505-1678-4e29-bb5b-ae293c77330c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:52:14.792998 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.792967 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b033b505-1678-4e29-bb5b-ae293c77330c-kube-api-access-nmxgb" (OuterVolumeSpecName: "kube-api-access-nmxgb") pod "b033b505-1678-4e29-bb5b-ae293c77330c" (UID: "b033b505-1678-4e29-bb5b-ae293c77330c"). 
InnerVolumeSpecName "kube-api-access-nmxgb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:52:14.891880 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.891858 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b033b505-1678-4e29-bb5b-ae293c77330c-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:52:14.891880 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.891879 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmxgb\" (UniqueName: \"kubernetes.io/projected/b033b505-1678-4e29-bb5b-ae293c77330c-kube-api-access-nmxgb\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:52:14.892014 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.891888 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b033b505-1678-4e29-bb5b-ae293c77330c-kserve-provision-location\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:52:14.892014 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:14.891901 2567 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-1ac54-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b033b505-1678-4e29-bb5b-ae293c77330c-raw-sklearn-1ac54-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:52:15.663299 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:15.663269 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" event={"ID":"b033b505-1678-4e29-bb5b-ae293c77330c","Type":"ContainerDied","Data":"6481464f0035d7e06b90eddc72fa11fa384c631c79266fa132b25f524c9d18e7"} Apr 24 21:52:15.663721 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:15.663321 2567 scope.go:117] "RemoveContainer" containerID="a82b719cc651db850d77936995999644ab5e7862e4960800fdff128d1ae11921" Apr 24 21:52:15.663721 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:15.663276 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf" Apr 24 21:52:15.664725 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:15.664706 2567 generic.go:358] "Generic (PLEG): container finished" podID="f59d6c94-3929-4f95-b202-40c968235482" containerID="e7f35b79b66047572ee27fd9d6c1f9f76af2618fc43c94efc40fcba5784800a0" exitCode=0 Apr 24 21:52:15.664800 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:15.664767 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" event={"ID":"f59d6c94-3929-4f95-b202-40c968235482","Type":"ContainerDied","Data":"e7f35b79b66047572ee27fd9d6c1f9f76af2618fc43c94efc40fcba5784800a0"} Apr 24 21:52:15.671396 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:15.671362 2567 scope.go:117] "RemoveContainer" containerID="ea8c56f71e50166e925eacff9b26ebbf3460372f84146f2fe0e2fb98ad4adfc2" Apr 24 21:52:15.679088 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:15.679069 2567 scope.go:117] "RemoveContainer" containerID="e9f33680adbc3b8bfb3a970685316067c265ac5a62c0dac41094696a0b7328e2" Apr 24 21:52:15.700234 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:15.700187 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf"] Apr 24 21:52:15.704918 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:15.704899 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-1ac54-predictor-995c89dc6-45xgf"] Apr 24 21:52:15.988328 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:15.988246 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" path="/var/lib/kubelet/pods/b033b505-1678-4e29-bb5b-ae293c77330c/volumes" Apr 24 21:52:16.670388 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:16.670355 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" event={"ID":"f59d6c94-3929-4f95-b202-40c968235482","Type":"ContainerStarted","Data":"43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf"} Apr 24 21:52:16.670388 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:16.670397 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" event={"ID":"f59d6c94-3929-4f95-b202-40c968235482","Type":"ContainerStarted","Data":"83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c"} Apr 24 21:52:16.670933 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:16.670721 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:16.670933 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:16.670841 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:16.671964 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:16.671934 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:52:16.692276 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:16.692236 2567 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podStartSLOduration=6.692223245 podStartE2EDuration="6.692223245s" podCreationTimestamp="2026-04-24 21:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:52:16.690982317 +0000 UTC m=+1421.232933915" watchObservedRunningTime="2026-04-24 21:52:16.692223245 +0000 UTC m=+1421.234174844" Apr 24 21:52:17.674318 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:17.674281 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:52:22.678533 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:22.678502 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:52:22.679031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:22.679005 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:52:32.679132 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:32.679054 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:52:42.679589 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:42.679548 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:52:52.679173 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:52:52.679132 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:53:02.679621 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:02.679587 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:53:12.679316 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:12.679274 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:53:22.679458 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:22.679431 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:53:30.954573 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:30.954537 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7"] Apr 24 21:53:30.954984 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:30.954953 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" containerID="cri-o://83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c" gracePeriod=30 Apr 24 21:53:30.955045 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:30.954993 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kube-rbac-proxy" containerID="cri-o://43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf" gracePeriod=30 Apr 24 21:53:31.926195 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:31.926161 2567 generic.go:358] "Generic (PLEG): container finished" podID="f59d6c94-3929-4f95-b202-40c968235482" containerID="43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf" exitCode=2 Apr 24 21:53:31.926378 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:31.926246 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" event={"ID":"f59d6c94-3929-4f95-b202-40c968235482","Type":"ContainerDied","Data":"43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf"} Apr 24 21:53:32.674875 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:32.674833 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused" Apr 24 21:53:32.679147 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:32.679120 2567 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 21:53:34.697636 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.697615 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:53:34.817657 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.817597 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88wjm\" (UniqueName: \"kubernetes.io/projected/f59d6c94-3929-4f95-b202-40c968235482-kube-api-access-88wjm\") pod \"f59d6c94-3929-4f95-b202-40c968235482\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " Apr 24 21:53:34.817657 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.817631 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f59d6c94-3929-4f95-b202-40c968235482-proxy-tls\") pod \"f59d6c94-3929-4f95-b202-40c968235482\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " Apr 24 21:53:34.817865 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.817667 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f59d6c94-3929-4f95-b202-40c968235482-kserve-provision-location\") pod \"f59d6c94-3929-4f95-b202-40c968235482\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " Apr 24 21:53:34.817865 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.817698 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f59d6c94-3929-4f95-b202-40c968235482-raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config\") pod \"f59d6c94-3929-4f95-b202-40c968235482\" (UID: \"f59d6c94-3929-4f95-b202-40c968235482\") " Apr 24 21:53:34.818040 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.818007 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f59d6c94-3929-4f95-b202-40c968235482-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f59d6c94-3929-4f95-b202-40c968235482" (UID: "f59d6c94-3929-4f95-b202-40c968235482"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:53:34.818151 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.818069 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f59d6c94-3929-4f95-b202-40c968235482-raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config") pod "f59d6c94-3929-4f95-b202-40c968235482" (UID: "f59d6c94-3929-4f95-b202-40c968235482"). InnerVolumeSpecName "raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:53:34.819641 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.819619 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59d6c94-3929-4f95-b202-40c968235482-kube-api-access-88wjm" (OuterVolumeSpecName: "kube-api-access-88wjm") pod "f59d6c94-3929-4f95-b202-40c968235482" (UID: "f59d6c94-3929-4f95-b202-40c968235482"). InnerVolumeSpecName "kube-api-access-88wjm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:53:34.819734 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.819618 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f59d6c94-3929-4f95-b202-40c968235482-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f59d6c94-3929-4f95-b202-40c968235482" (UID: "f59d6c94-3929-4f95-b202-40c968235482"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:53:34.918236 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.918213 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-88wjm\" (UniqueName: \"kubernetes.io/projected/f59d6c94-3929-4f95-b202-40c968235482-kube-api-access-88wjm\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:53:34.918236 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.918236 2567 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f59d6c94-3929-4f95-b202-40c968235482-proxy-tls\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:53:34.918369 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.918246 2567 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f59d6c94-3929-4f95-b202-40c968235482-kserve-provision-location\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:53:34.918369 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.918255 2567 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f59d6c94-3929-4f95-b202-40c968235482-raw-sklearn-runtime-1f653-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-249.ec2.internal\" DevicePath \"\"" Apr 24 21:53:34.938244 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.938216 2567 generic.go:358] "Generic (PLEG): container finished" podID="f59d6c94-3929-4f95-b202-40c968235482" containerID="83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c" exitCode=0 Apr 24 21:53:34.938338 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.938288 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" Apr 24 21:53:34.938338 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.938296 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" event={"ID":"f59d6c94-3929-4f95-b202-40c968235482","Type":"ContainerDied","Data":"83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c"} Apr 24 21:53:34.938338 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.938335 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7" event={"ID":"f59d6c94-3929-4f95-b202-40c968235482","Type":"ContainerDied","Data":"aec29aabaa811ecccff6dc8cdeef409dd2258f0e52daab93c3bc4f732c6a7cb6"} Apr 24 21:53:34.938443 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.938351 2567 scope.go:117] "RemoveContainer" containerID="43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf" Apr 24 21:53:34.947228 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.947162 2567 scope.go:117] "RemoveContainer" containerID="83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c" Apr 24 21:53:34.954266 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.954249 2567 scope.go:117] "RemoveContainer" containerID="e7f35b79b66047572ee27fd9d6c1f9f76af2618fc43c94efc40fcba5784800a0" Apr 24 21:53:34.961307 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.961288 2567 scope.go:117] "RemoveContainer" containerID="43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf" Apr 24 21:53:34.961604 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:53:34.961581 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf\": container with ID starting with 43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf not found: ID does not exist" containerID="43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf" Apr 24 21:53:34.961689 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.961617 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf"} err="failed to get container status \"43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf\": rpc error: code = NotFound desc = could not find container \"43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf\": container with ID starting with 43326acc3fb6e99db097d5b2b24c2d2e9c1b98876014d93287762aea1428d0cf not found: ID does not exist" Apr 24 21:53:34.961689 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.961643 2567 scope.go:117] "RemoveContainer" containerID="83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c" Apr 24 21:53:34.961937 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:53:34.961915 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c\": container with ID starting with 83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c not found: ID does not exist" containerID="83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c" Apr 24 21:53:34.962020 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.961942 2567 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c"} err="failed to get container status \"83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c\": rpc error: code = NotFound desc = could not find container \"83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c\": container with ID starting with 83cae38027320650183e184e2c6e468dd370d564cfcad49200124842dffcae6c not found: ID does not exist" Apr 24 21:53:34.962020 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.961959 2567 scope.go:117] "RemoveContainer" containerID="e7f35b79b66047572ee27fd9d6c1f9f76af2618fc43c94efc40fcba5784800a0" Apr 24 21:53:34.962240 ip-10-0-134-249 kubenswrapper[2567]: E0424 21:53:34.962221 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7f35b79b66047572ee27fd9d6c1f9f76af2618fc43c94efc40fcba5784800a0\": container with ID starting with e7f35b79b66047572ee27fd9d6c1f9f76af2618fc43c94efc40fcba5784800a0 not found: ID does not exist" containerID="e7f35b79b66047572ee27fd9d6c1f9f76af2618fc43c94efc40fcba5784800a0" Apr 24 21:53:34.962290 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.962245 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f35b79b66047572ee27fd9d6c1f9f76af2618fc43c94efc40fcba5784800a0"} err="failed to get container status \"e7f35b79b66047572ee27fd9d6c1f9f76af2618fc43c94efc40fcba5784800a0\": rpc error: code = NotFound desc = could not find container \"e7f35b79b66047572ee27fd9d6c1f9f76af2618fc43c94efc40fcba5784800a0\": container with ID starting with e7f35b79b66047572ee27fd9d6c1f9f76af2618fc43c94efc40fcba5784800a0 not found: ID does not exist" Apr 24 21:53:34.962329 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.962281 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7"] Apr 24 21:53:34.969922 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:34.969903 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-1f653-predictor-7c88d7c964-nc9t7"] Apr 24 21:53:35.982366 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:35.982334 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f59d6c94-3929-4f95-b202-40c968235482" path="/var/lib/kubelet/pods/f59d6c94-3929-4f95-b202-40c968235482/volumes" Apr 24 21:53:35.998453 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:35.998424 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:53:36.002586 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:36.002567 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log" Apr 24 21:53:59.314966 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:59.314896 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mxqrb_67ffea21-d8d4-46a1-ab6a-9c97c0cb589e/global-pull-secret-syncer/0.log" Apr 24 21:53:59.389275 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:59.389249 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-g6b6t_eea02f54-e68c-472a-b138-f0b60cf3f2b8/konnectivity-agent/0.log" Apr 24 21:53:59.435851 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:53:59.435832 2567 
log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-249.ec2.internal_6e6f541dfff28fedc7fdca7f3c5d9590/haproxy/0.log" Apr 24 21:54:02.724322 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:02.724276 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-tjtvx_51955aae-3c73-4c5a-9e8c-e93d7e1ed29d/cluster-monitoring-operator/0.log" Apr 24 21:54:02.749518 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:02.749471 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-ws6r5_8ab5b171-1976-4834-9bdc-b77f8f597ceb/kube-state-metrics/0.log" Apr 24 21:54:02.771354 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:02.771332 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-ws6r5_8ab5b171-1976-4834-9bdc-b77f8f597ceb/kube-rbac-proxy-main/0.log" Apr 24 21:54:02.795283 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:02.795263 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-ws6r5_8ab5b171-1976-4834-9bdc-b77f8f597ceb/kube-rbac-proxy-self/0.log" Apr 24 21:54:02.826064 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:02.826042 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-68995464cf-q4lvq_0102fb1c-fbc0-464f-8967-c430cf0ee1df/metrics-server/0.log" Apr 24 21:54:02.884252 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:02.884232 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2bmn2_f029b748-78df-447d-8158-9a6cca578bb4/node-exporter/0.log" Apr 24 21:54:02.909785 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:02.909720 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2bmn2_f029b748-78df-447d-8158-9a6cca578bb4/kube-rbac-proxy/0.log" Apr 24 21:54:02.933339 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:02.933318 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2bmn2_f029b748-78df-447d-8158-9a6cca578bb4/init-textfile/0.log" Apr 24 21:54:03.410338 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:03.410307 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-v8qkr_700d16c7-e5f3-4120-acf4-338514077b97/prometheus-operator/0.log" Apr 24 21:54:03.431864 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:03.431844 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-v8qkr_700d16c7-e5f3-4120-acf4-338514077b97/kube-rbac-proxy/0.log" Apr 24 21:54:03.458167 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:03.458147 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-gmklq_35b673bf-f435-46d8-a296-c4719d80ee9d/prometheus-operator-admission-webhook/0.log" Apr 24 21:54:04.883997 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:04.883966 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-f8hxj_6b55e3f0-b53c-4afe-88de-6b0ada988fc9/networking-console-plugin/0.log" Apr 24 21:54:05.691126 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:05.691097 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-556f786b98-9qsjb_d8ab2221-fd98-4f74-98bf-6a415c4994a6/console/0.log" Apr 24 21:54:05.719637 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:05.719611 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-qrhbm_03c30a98-1967-4567-88ca-9f2cc397987a/download-server/0.log" Apr 24 21:54:06.015453 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015380 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s"] Apr 24 21:54:06.015799 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015720 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" Apr 24 21:54:06.015799 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015730 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" Apr 24 21:54:06.015799 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015739 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="storage-initializer" Apr 24 21:54:06.015799 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015746 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="storage-initializer" Apr 24 21:54:06.015799 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015755 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="storage-initializer" Apr 24 21:54:06.015799 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015762 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="storage-initializer" Apr 24 21:54:06.015799 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015775 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kube-rbac-proxy" Apr 24 21:54:06.015799 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015780 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kube-rbac-proxy" Apr 24 21:54:06.015799 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015795 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" Apr 24 21:54:06.015799 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015801 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" Apr 24 21:54:06.016111 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015812 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kube-rbac-proxy" Apr 24 21:54:06.016111 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015817 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kube-rbac-proxy" Apr 24 21:54:06.016111 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015870 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kserve-container" Apr 24 21:54:06.016111 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:54:06.015879 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kserve-container" Apr 24 21:54:06.016111 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015887 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="b033b505-1678-4e29-bb5b-ae293c77330c" containerName="kube-rbac-proxy" Apr 24 21:54:06.016111 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.015893 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="f59d6c94-3929-4f95-b202-40c968235482" containerName="kube-rbac-proxy" Apr 24 21:54:06.018960 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.018939 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.021123 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.021105 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fdcwc\"/\"openshift-service-ca.crt\"" Apr 24 21:54:06.021220 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.021137 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fdcwc\"/\"kube-root-ca.crt\"" Apr 24 21:54:06.021809 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.021789 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-fdcwc\"/\"default-dockercfg-gd25x\"" Apr 24 21:54:06.028705 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.028685 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s"] Apr 24 21:54:06.036870 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.036848 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/aef83213-42f0-495e-b48a-e67d3b1e84e3-proc\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.036969 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.036911 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/aef83213-42f0-495e-b48a-e67d3b1e84e3-podres\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.037031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.036976 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aef83213-42f0-495e-b48a-e67d3b1e84e3-lib-modules\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.037031 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.037011 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf7z5\" (UniqueName: \"kubernetes.io/projected/aef83213-42f0-495e-b48a-e67d3b1e84e3-kube-api-access-gf7z5\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.037132 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:54:06.037060 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aef83213-42f0-495e-b48a-e67d3b1e84e3-sys\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.137717 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.137696 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-5hcrl_5acaffdc-65cb-4b11-a572-3f3d38f308c0/volume-data-source-validator/0.log" Apr 24 21:54:06.141910 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.141882 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/aef83213-42f0-495e-b48a-e67d3b1e84e3-proc\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.142007 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.141931 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/aef83213-42f0-495e-b48a-e67d3b1e84e3-podres\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.142007 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.141954 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aef83213-42f0-495e-b48a-e67d3b1e84e3-lib-modules\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.142007 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.141982 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gf7z5\" (UniqueName: \"kubernetes.io/projected/aef83213-42f0-495e-b48a-e67d3b1e84e3-kube-api-access-gf7z5\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.142174 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.142009 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/aef83213-42f0-495e-b48a-e67d3b1e84e3-proc\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.142174 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.142034 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aef83213-42f0-495e-b48a-e67d3b1e84e3-sys\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.142174 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.142074 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/aef83213-42f0-495e-b48a-e67d3b1e84e3-podres\") pod \"perf-node-gather-daemonset-72f6s\" (UID: 
\"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.142174 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.142100 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aef83213-42f0-495e-b48a-e67d3b1e84e3-sys\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.142174 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.142125 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aef83213-42f0-495e-b48a-e67d3b1e84e3-lib-modules\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.150221 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.150193 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf7z5\" (UniqueName: \"kubernetes.io/projected/aef83213-42f0-495e-b48a-e67d3b1e84e3-kube-api-access-gf7z5\") pod \"perf-node-gather-daemonset-72f6s\" (UID: \"aef83213-42f0-495e-b48a-e67d3b1e84e3\") " pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.329420 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.329338 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:06.448333 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.448311 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s"] Apr 24 21:54:06.450858 ip-10-0-134-249 kubenswrapper[2567]: W0424 21:54:06.450834 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaef83213_42f0_495e_b48a_e67d3b1e84e3.slice/crio-4b51197cbc8a3d867e6124a232157f3171253306f53fd36e9eaccf6fb3a61d06 WatchSource:0}: Error finding container 4b51197cbc8a3d867e6124a232157f3171253306f53fd36e9eaccf6fb3a61d06: Status 404 returned error can't find the container with id 4b51197cbc8a3d867e6124a232157f3171253306f53fd36e9eaccf6fb3a61d06 Apr 24 21:54:06.860291 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.860261 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5tp2b_6a5a230a-7dee-4c4a-882c-d0cb1e017e43/dns/0.log" Apr 24 21:54:06.882481 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:06.882459 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5tp2b_6a5a230a-7dee-4c4a-882c-d0cb1e017e43/kube-rbac-proxy/0.log" Apr 24 21:54:07.050580 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:07.050557 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" event={"ID":"aef83213-42f0-495e-b48a-e67d3b1e84e3","Type":"ContainerStarted","Data":"68c8009685e219d2ca80a92adc92b8072203b49c34310c0dfa03aceb529d7bb1"} Apr 24 21:54:07.050912 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:07.050586 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" event={"ID":"aef83213-42f0-495e-b48a-e67d3b1e84e3","Type":"ContainerStarted","Data":"4b51197cbc8a3d867e6124a232157f3171253306f53fd36e9eaccf6fb3a61d06"} Apr 24 21:54:07.050912 ip-10-0-134-249 
kubenswrapper[2567]: I0424 21:54:07.050674 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" Apr 24 21:54:07.066922 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:07.066881 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s" podStartSLOduration=1.0668703449999999 podStartE2EDuration="1.066870345s" podCreationTimestamp="2026-04-24 21:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:54:07.066025391 +0000 UTC m=+1531.607977001" watchObservedRunningTime="2026-04-24 21:54:07.066870345 +0000 UTC m=+1531.608821943" Apr 24 21:54:07.126355 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:07.126295 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xdhfk_f3c0ee91-65bf-4825-9def-c62c4806cc59/dns-node-resolver/0.log" Apr 24 21:54:07.697571 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:07.697519 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qj6pk_58290685-bbba-48e3-936e-34b4f4d27034/node-ca/0.log" Apr 24 21:54:08.424079 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:08.424044 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6fb67cdcfb-x2njp_68ab58ce-0982-4fa9-91b7-027cc0ef3bb7/router/0.log" Apr 24 21:54:08.853463 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:08.853440 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rrff5_3b2d31d7-10c7-40a8-ba71-bb0eb1d00f2f/serve-healthcheck-canary/0.log" Apr 24 21:54:09.223264 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:09.223193 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-crpwz_32477472-4713-476c-ac5d-4d2a735ad4b7/insights-operator/1.log" Apr 24 21:54:09.223264 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:09.223220 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-crpwz_32477472-4713-476c-ac5d-4d2a735ad4b7/insights-operator/0.log" Apr 24 21:54:09.315512 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:09.315473 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ftrc6_29129b49-ae5a-46a2-b952-958acbcd5d52/kube-rbac-proxy/0.log" Apr 24 21:54:09.341200 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:09.341179 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ftrc6_29129b49-ae5a-46a2-b952-958acbcd5d52/exporter/0.log" Apr 24 21:54:09.363340 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:09.363316 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ftrc6_29129b49-ae5a-46a2-b952-958acbcd5d52/extractor/0.log" Apr 24 21:54:11.482706 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:11.482678 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-n5pfz_9256e237-2f00-4a32-9878-6afad79c86d5/manager/0.log" Apr 24 21:54:11.530559 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:11.530518 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-zfqls_186b84da-4fba-40a2-8597-bda518c452d3/seaweedfs/0.log" 
Apr 24 21:54:13.063357 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:13.063328 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-fdcwc/perf-node-gather-daemonset-72f6s"
Apr 24 21:54:15.595765 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:15.595741 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-v2n27_090e54a8-26ef-4bf1-ae67-483015286e44/migrator/0.log"
Apr 24 21:54:15.620050 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:15.620029 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-v2n27_090e54a8-26ef-4bf1-ae67-483015286e44/graceful-termination/0.log"
Apr 24 21:54:15.984160 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:15.984130 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-f22xr_23da5cf6-4806-441a-8d04-9ddc0c84d07b/kube-storage-version-migrator-operator/1.log"
Apr 24 21:54:15.985220 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:15.985194 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-f22xr_23da5cf6-4806-441a-8d04-9ddc0c84d07b/kube-storage-version-migrator-operator/0.log"
Apr 24 21:54:17.376114 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:17.376085 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7f8g_13ab36a9-43a5-4f65-a1cc-a42a4f3c183d/kube-multus-additional-cni-plugins/0.log"
Apr 24 21:54:17.399109 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:17.399087 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7f8g_13ab36a9-43a5-4f65-a1cc-a42a4f3c183d/egress-router-binary-copy/0.log"
Apr 24 21:54:17.421309 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:17.421291 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7f8g_13ab36a9-43a5-4f65-a1cc-a42a4f3c183d/cni-plugins/0.log"
Apr 24 21:54:17.442770 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:17.442753 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7f8g_13ab36a9-43a5-4f65-a1cc-a42a4f3c183d/bond-cni-plugin/0.log"
Apr 24 21:54:17.467109 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:17.467087 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7f8g_13ab36a9-43a5-4f65-a1cc-a42a4f3c183d/routeoverride-cni/0.log"
Apr 24 21:54:17.496153 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:17.496134 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7f8g_13ab36a9-43a5-4f65-a1cc-a42a4f3c183d/whereabouts-cni-bincopy/0.log"
Apr 24 21:54:17.522197 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:17.522141 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7f8g_13ab36a9-43a5-4f65-a1cc-a42a4f3c183d/whereabouts-cni/0.log"
Apr 24 21:54:17.621326 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:17.621305 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vjdz9_7e0b111b-0c8c-40e0-838f-d5768a4fd67a/kube-multus/0.log"
Apr 24 21:54:17.652047 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:17.652024 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fnmhx_b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58/network-metrics-daemon/0.log"
Apr 24 21:54:17.676423 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:17.676404 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fnmhx_b8d9490d-3a97-4e4b-b1b1-fbbd994c8a58/kube-rbac-proxy/0.log"
Apr 24 21:54:19.081936 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:19.081904 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-controller/0.log"
Apr 24 21:54:19.100943 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:19.100919 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/0.log"
Apr 24 21:54:19.107099 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:19.107078 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovn-acl-logging/1.log"
Apr 24 21:54:19.129792 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:19.129769 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/kube-rbac-proxy-node/0.log"
Apr 24 21:54:19.160036 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:19.160013 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 21:54:19.198888 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:19.198872 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/northd/0.log"
Apr 24 21:54:19.222154 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:19.222135 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/nbdb/0.log"
Apr 24 21:54:19.262451 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:19.262431 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/sbdb/0.log"
Apr 24 21:54:19.363486 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:19.363460 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qr6dh_14209990-6df9-445b-a825-ae10b8c6b84d/ovnkube-controller/0.log"
Apr 24 21:54:20.341413 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:20.341377 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mxwm9_8054fc2a-a4f1-4d30-9017-96d3932c580f/network-check-target-container/0.log"
Apr 24 21:54:21.346244 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:21.346214 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-q6trf_54414862-ab41-4d71-8c87-40ce2fe45ac4/iptables-alerter/0.log"
Apr 24 21:54:22.033780 ip-10-0-134-249 kubenswrapper[2567]: I0424 21:54:22.033752 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-pksbn_755f7e1a-7e39-472a-9a15-3ecbce3571b8/tuned/0.log"